I am iterating over an array whose elements each contain another array (a nested array). I need to .map() the outer array in such a way that it filters the nested array based on some criteria.
Here is an example:
JSON
[{
    "id": "CAM000001",
    "type": 128,
    "name": "abc",
    "fieldSets": [
        {
            "fields": [
                {
                    "entity_name": "abc_id",
                    "type": "String",
                    "value": ""
                },
                {
                    "entity_name": "abc_name",
                    "type": "String",
                    "value": "XYZ Inc."
                },
                {
                    "entity_name": "created_on",
                    "type": "Date",
                    "value": "09/20/2016"
                }
            ]
        }
    ]
}]
Code
datas = datas.map(data => {
    data.fieldSets[0].fields = data.fieldSets[0].fields.filter(field => {
        return field.entity_name === 'abc_name';
    });
    return data;
});
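For reference, here is the snippet run standalone against a reduced copy of the sample JSON above, so the effect can be checked directly: only the abc_name entry survives the filter.

```javascript
// Reduced copy of the sample data above, inlined so this runs standalone.
let datas = [{
    id: 'CAM000001',
    fieldSets: [{
        fields: [
            { entity_name: 'abc_id', type: 'String', value: '' },
            { entity_name: 'abc_name', type: 'String', value: 'XYZ Inc.' },
            { entity_name: 'created_on', type: 'Date', value: '09/20/2016' }
        ]
    }]
}];

// Same logic as the question: mutate each element's nested fields array.
datas = datas.map(data => {
    data.fieldSets[0].fields = data.fieldSets[0].fields.filter(field => {
        return field.entity_name === 'abc_name';
    });
    return data;
});

console.log(datas[0].fieldSets[0].fields);
// [ { entity_name: 'abc_name', type: 'String', value: 'XYZ Inc.' } ]
```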
I searched a bit, and it seems the above code has a time complexity of \$\mathcal{O}(n^2)\$ (I am still learning about time and space complexity, so please correct my understanding if it's wrong). Considering large datasets, if fields (the nested array) and datas (the parent array) keep growing in size, this would cost a lot. Can you please help me understand the best possible solution to avoid the worst time complexities? Is what I am doing here correct?
A comment on the question noted: "map is misleading; you should use datas.forEach(...), or remove the inner fields assignment and return a new object."
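A sketch of the comment's second suggestion: keep map but build new objects instead of mutating the input, so the original datas is left untouched. The helper name filterFields is my own, not from the original post.

```javascript
// Non-mutating variant: spread each element into a fresh object and replace
// only its fieldSets; the input array and its objects are never modified.
const filterFields = (datas, entityName) =>
    datas.map(data => ({
        ...data,
        fieldSets: data.fieldSets.map(set => ({
            ...set,
            fields: set.fields.filter(field => field.entity_name === entityName)
        }))
    }));

const input = [{
    id: 'CAM000001',
    fieldSets: [{
        fields: [
            { entity_name: 'abc_id', type: 'String', value: '' },
            { entity_name: 'abc_name', type: 'String', value: 'XYZ Inc.' }
        ]
    }]
}];

const result = filterFields(input, 'abc_name');
// result keeps only the abc_name field, while input still has both fields.
```

Returning new objects makes map honest again (no hidden side effects) and keeps the original data available if some other code still needs it; the amount of work is the same, one filter pass per nested array.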