
Multirow create by GraphQL and return-handling

by Florian Haase

Hi

 

we have now implemented multi-row import on our side through GraphQL (that means collecting several rows into one query) to avoid hitting quota limits during the import.

 

But we don't know how to handle the result:

 

1) The error returns have a row reference, so we know which row we sent in caused an error. That's okay: we can mark our import rows with this error message. Perfect.

 

2) But at the same time we want to update our import rows with the value suggestions from BNXT (for example, when we send in an associate and let BNXT suggest the associateNo, we want to use the response to update our import source with this number). And here the trouble starts: it seems to us that some error situations exclude the row from the items result and some do not. How can we reliably reference the item results back to the values/rows we sent to the API? We also don't know whether it is guaranteed that the item results come back in the exact same order as the rows we sent in. Can you confirm that?
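To illustrate the error side, here is a minimal sketch in Python. It assumes that the "values/N" segment of each error path is the zero-based index into the values array we sent (which is exactly the part we would like confirmed):

```python
# Sketch: group GraphQL errors by input-row index.
# Assumption: the "values/N" path segment carries the zero-based index
# into the values array sent in the mutation.

def errors_by_row(errors):
    """Group error messages by the input-row index taken from the path."""
    by_row = {}
    for err in errors:
        for segment in err.get("path", []):
            if isinstance(segment, str) and segment.startswith("values/"):
                row = int(segment.split("/", 1)[1])
                by_row.setdefault(row, []).append(err["message"])
    return by_row

# Example errors shaped like the responses shown later in this thread.
errors = [
    {"message": "Error: The value is already in use.",
     "path": ["useCompany", "associate_create", "values/0"]},
    {"message": "Error: The value is already in use.",
     "path": ["useCompany", "associate_create", "values/2"]},
]

print(errors_by_row(errors))  # {0: ['Error: The value is already in use.'], 2: ['Error: The value is already in use.']}
```

This works for the errors array, but it does not solve the items side, because a row dropped from items shifts the positions of everything after it.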

 

When we send rows one by one, we get one answer and possibly one error message per row. Then we have a 100% reference to our import and no problems, but then we will almost certainly hit the quota limits.

 

Florian

 

6 REPLIES

by Florian Haase

Hi

 

we still have no good solution for this. Therefore we have to go back to sending records one by one for updates. We also have to send inserts one by one where we need the return values for each record, since we can't find a proper way to reference the answers back to the requests. For inserts where we don't need the suggested values from BNXT, we will keep sending the requests batched.

 

Florian


by Florian Haase

Here is another problem with the insert:

 

The documentation says that the affectedRows property gives the number of rows that were successfully inserted.

It is also documented that items returns all successfully imported rows:

 

When we try to import 20 customers we get the following response:

{
"errors": [
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/0"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/0"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/1"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/1"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/2"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/2"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/3"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/3"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/4"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/4"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/5"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/5"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/6"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/6"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/7"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/7"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/8"
],
"extensions": {
"data": {
"status": 0
}
}
},
{
"message": "Error: The value is already in use.",
"path": [
"useCompany",
"associate_create",
"values/8"
],
"extensions": {
"data": {
"status": 0
}
}
}
],
"data": {
"useCompany": {
"associate_create": {
"affectedRows": 20,
"items": [
{
"externalId": "",
"customerNo": 0,
"name": "Arne Bergfjord"
},
{
"externalId": "",
"customerNo": 0,
"name": "Milaim Imeri"
},
{
"externalId": "",
"customerNo": 0,
"name": "Leif E. Tømmervåg"
},
{
"externalId": "",
"customerNo": 0,
"name": "Supaluck Antarit"
},
{
"externalId": "",
"customerNo": 0,
"name": "Jan Anders Hansen-Zahl"
},
{
"externalId": "",
"customerNo": 0,
"name": "Knut Kristen Kuløy Kjenstad"
},
{
"externalId": "",
"customerNo": 0,
"name": "Alf Rune Totland"
},
{
"externalId": "",
"customerNo": 0,
"name": "Nils Harald Østevold"
},
{
"externalId": "",
"customerNo": 0,
"name": "Kjell Arne Tokheim"
},
{
"externalId": "1235",
"customerNo": 7001235,
"name": "Svein-Ivar Holm"
},
{
"externalId": "1243",
"customerNo": 7001243,
"name": "Asbjørn Lochert"
},
{
"externalId": "2995",
"customerNo": 7002995,
"name": "Geir Skarstein"
},
{
"externalId": "740328",
"customerNo": 7740328,
"name": "Evelyn Olafsen"
},
{
"externalId": "1289",
"customerNo": 7001289,
"name": "Sverre Jørgensen"
},
{
"externalId": "650024",
"customerNo": 7650024,
"name": "Bjørn Mo"
},
{
"externalId": "",
"customerNo": 0,
"name": "HR Danmark"
},
{
"externalId": "460011",
"customerNo": 7460011,
"name": "Ola Sukka"
},
{
"externalId": "460021",
"customerNo": 7460021,
"name": "Kristian Reitan"
},
{
"externalId": "",
"customerNo": 0,
"name": "HR Sogn"
},
{
"externalId": "",
"customerNo": 0,
"name": "HR Vestfold"
}
]
}
}
},
"extensions": {
"vbnxt-trace-id": "0000000000000000a38b590571dc5a3a"
}
}  

20 affected rows and 20 items, but a bunch of error messages (because the customers already exist).

 

Would we always get all rows back (including the failed ones), or is this just for the associate table, or is this a bug?

 

Florian

 

Hi,

I got the same errors. What I found out is that the field externalId must have a unique value.

So when you try to save an empty string, I think either the API or the model treats it as the same value.
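If that is the cause, a hypothetical pre-check on the client side could catch these rows before the batch is sent (a sketch; the row shape is just an illustration, not the actual API payload):

```python
def check_external_ids(rows):
    """Flag rows whose externalId is empty or duplicates an earlier row,
    so they can be fixed (or sent individually) instead of failing the batch."""
    seen = {}       # externalId -> first row index that used it
    problems = []   # (row index, reason)
    for i, row in enumerate(rows):
        ext = row.get("externalId", "")
        if ext == "":
            problems.append((i, "empty externalId"))
        elif ext in seen:
            problems.append((i, f"duplicate of row {seen[ext]}"))
        else:
            seen[ext] = i
    return problems

rows = [{"externalId": "1235"}, {"externalId": ""}, {"externalId": "1235"}]
print(check_external_ids(rows))  # [(1, 'empty externalId'), (2, 'duplicate of row 0')]
```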


20 affected rows and 20 items, but a bunch of error messages (because the customers already exist).

 

Would we always get all rows back (including the failed ones), or is this just for the associate table, or is this a bug?


This is what we get back from the backend. No data is created in GraphQL itself; GraphQL is just a middleman that transforms the request into something the backend understands, and then transforms the response back into the shape that was requested.


by omelhus

It seems like there should be native support for batching in graphql-dotnet (https://github.com/graphql-dotnet/server/pull/241/files), but the API returns a 500 error when I try it. Maybe there is some kind of body validation in nginx, or in a middleware, that breaks the functionality?


by omelhus (updated 31-05-2023 11:09)

The best way to solve this is still operation batching (https://community.visma.com/t5/Diskusjon/Please-enable-operation-batching/td-p/580571), but it doesn't work with the API yet.

 

I've found an open GraphQL API that supports batching in a sensible way. @Marius Bancila, have a look at how they've solved batching at https://countries.trevorblades.com/graphql using the following body:

 

 

[{
    "query": "query Query($country: ID!) { country(code: $country) { name } }",
    "variables": {
        "country": "NO"
    }
},
{
    "query": "query Query($country: ID!) { country(code: $country) { name } }",
    "variables": {
        "country": "SE"
    }
},
{
    "query": "query Query($country: ID!) { country(code: $country) { name } }"
}]

 

 

The last query will fail since it's missing "variables". This will return the following response:

 

 

[
    {
        "data": {
            "country": {
                "name": "Norway"
            }
        }
    },
    {
        "data": {
            "country": {
                "name": "Sweden"
            }
        }
    },
    {
        "errors": [
            {
                "message": "Variable \"$country\" of required type \"ID!\" was not provided.",
                "locations": [
                    {
                        "line": 1,
                        "column": 13
                    }
                ]
            }
        ]
    }
]

 

 

I think this is a sensible approach.
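To spell out why this solves the reference problem from the start of the thread: the response array lines up one-to-one with the request array, so result i always belongs to request i, even when some operations fail. A small sketch using the payloads above:

```python
# Sketch: with operation batching, responses pair with requests by array
# position, so each result can be traced back to its originating request.
# The payloads mirror the countries.trevorblades.com example above.

requests = [
    {"query": "query Query($country: ID!) { country(code: $country) { name } }",
     "variables": {"country": "NO"}},
    {"query": "query Query($country: ID!) { country(code: $country) { name } }",
     "variables": {"country": "SE"}},
    {"query": "query Query($country: ID!) { country(code: $country) { name } }"},
]

responses = [
    {"data": {"country": {"name": "Norway"}}},
    {"data": {"country": {"name": "Sweden"}}},
    {"errors": [{"message": 'Variable "$country" of required type "ID!" was not provided.'}]},
]

# Index i of the response belongs to request i -- the reference guarantee
# that is missing from the multi-row values approach.
paired = [
    (req.get("variables", {}).get("country"), "errors" not in res)
    for req, res in zip(requests, responses)
]
print(paired)  # [('NO', True), ('SE', True), (None, False)]
```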