I have a JavaScript array like:
[["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]]
How would I go about merging the separate inner arrays into one like:
["$6", "$12", "$25", ...]
ES2019 introduced the Array.prototype.flat() method, which you can use to flatten the arrays. It is compatible with most environments, although it is only available in Node.js starting with version 11, and not at all in Internet Explorer.
const arrays = [
["$6"],
["$12"],
["$25"],
["$25"],
["$18"],
["$22"],
["$10"]
];
const merge3 = arrays.flat(1); // the depth level: how deep a nested array structure should be flattened (defaults to 1)
console.log(merge3);
For older browsers, you can use Array.prototype.concat
to merge arrays:
var arrays = [
["$6"],
["$12"],
["$25"],
["$25"],
["$18"],
["$22"],
["$10"]
];
var merged = [].concat.apply([], arrays);
console.log(merged);
Using the apply method of concat will just take the second parameter as an array, so the last line is identical to this:
var merged = [].concat(["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]);
With spread syntax you can write var merged = [].concat(...arrays) (or Array.prototype.concat(...arrays)). This version works with TypeScript 2.3.0's --strict mode, but it doesn't work with nested arrays (it's not recursive).
Here's a short function that uses some of the newer JavaScript array methods to flatten an n-dimensional array.
function flatten(arr) {
return arr.reduce(function (flat, toFlatten) {
return flat.concat(Array.isArray(toFlatten) ? flatten(toFlatten) : toFlatten);
}, []);
}
Usage:
flatten([[1, 2, 3], [4, 5]]); // [1, 2, 3, 4, 5]
flatten([[[1, [1.1]], 2, 3], [4, 5]]); // [1, 1.1, 2, 3, 4, 5]
Note the empty array [] passed as the second argument to reduce: it becomes the initial value of flat in the first call to the anonymous function passed to reduce. If it is not specified, then the first call to reduce binds the first value out of the array to flat, which would eventually result in 1 being bound to flat in both the examples, and 1.concat is not a function.
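To see why the [] seed matters, here is an illustrative sketch of the same function without it (flattenNoSeed is a hypothetical name, not from the original answer):
function flattenNoSeed(arr) {
  return arr.reduce(function (flat, toFlatten) {
    return flat.concat(Array.isArray(toFlatten) ? flattenNoSeed(toFlatten) : toFlatten);
  });
}
flattenNoSeed([[[1, [1.1]], 2, 3], [4, 5]]);
// TypeError: flat.concat is not a function (a number ends up bound to flat in a recursive call)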
const flatten = (arr) => arr.reduce((flat, next) => flat.concat(next), []);
const flatten = (arr) => arr.reduce((flat, next) => flat.concat(Array.isArray(next) ? flatten(next) : next), []);
There is a confusingly hidden method, which constructs a new array without mutating the original one:
var oldArray = [[1],[2,3],[4]];
var newArray = Array.prototype.concat.apply([], oldArray);
console.log(newArray); // [ 1, 2, 3, 4 ]
The same can be written with spread syntax: [].concat(...[ [1],[2,3],[4] ]).
const flat = (e) => Array.isArray(e) ? [].concat.apply([], e.map(flat)) : e
This is best done with JavaScript's reduce function.
var arrays = [["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"], ["$0"], ["$15"],["$3"], ["$75"], ["$5"], ["$100"], ["$7"], ["$3"], ["$75"], ["$5"]];
arrays = arrays.reduce(function(a, b){
return a.concat(b);
}, []);
Or, with ES2015:
arrays = arrays.reduce((a, b) => a.concat(b), []);
arrays.reduce((flatten, arr) => [...flatten, ...arr])
There's a new native method called flat to do exactly this.
(As of late 2019, flat is published in the ECMAScript 2019 standard, and core-js@3 (Babel's polyfill library) includes it in its polyfill.)
const arr1 = [1, 2, [3, 4]];
arr1.flat();
// [1, 2, 3, 4]
const arr2 = [1, 2, [3, 4, [5, 6]]];
arr2.flat();
// [1, 2, 3, 4, [5, 6]]
// Flatten 2 levels deep
const arr3 = [2, 2, 5, [5, [5, [6]], 7]];
arr3.flat(2);
// [2, 2, 5, 5, 5, [6], 7];
// Flatten all levels
const arr4 = [2, 2, 5, [5, [5, [6]], 7]];
arr4.flat(Infinity);
// [2, 2, 5, 5, 5, 6, 7];
Most of the answers here don't work on huge (e.g. 200,000 elements) arrays, and even if they do, they're slow.
Here is the fastest solution, which also works on arrays with multiple levels of nesting:
const flatten = function(arr, result = []) {
for (let i = 0, length = arr.length; i < length; i++) {
const value = arr[i];
if (Array.isArray(value)) {
flatten(value, result);
} else {
result.push(value);
}
}
return result;
};
flatten(Array(200000).fill([1]));
It handles huge arrays just fine. On my machine this code takes about 14 ms to execute.
flatten(Array(2).fill(Array(2).fill(Array(2).fill([1]))));
It works with nested arrays. This code produces [1, 1, 1, 1, 1, 1, 1, 1].
flatten([1, [1], [[1]]]);
It doesn't have any problems with flattening arrays like this one.
(Other solutions here fail on such arrays with RangeError: Maximum call stack size exceeded.) For a 20,000-element array it takes 2-5 milliseconds.
Update: it turned out that this solution doesn't work with large arrays. If you're looking for a better, faster solution, check out this answer.
function flatten(arr) {
return [].concat(...arr)
}
It simply expands arr and passes it as arguments to concat(), which merges all the arrays into one. It's equivalent to [].concat.apply([], arr).
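A quick sketch of that equivalence, using the question's kind of data:
const arr = [["$6"], ["$12"], ["$25"]];
console.log([].concat(...arr));                   // ["$6", "$12", "$25"]
console.log([].concat(["$6"], ["$12"], ["$25"])); // same: spread just lists the inner arrays
console.log([].concat.apply([], arr));            // same result via apply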
You can also try this for deep flattening:
function deepFlatten(arr) {
return flatten( // return shallowly flattened array
arr.map(x=> // with each x in array
Array.isArray(x) // is x an array?
? deepFlatten(x) // if yes, return deeply flattened x
: x // if no, return just x
)
)
}
See demo on JSBin.
Side note: methods like find() and arrow functions are not supported by all browsers, but it doesn't mean that you can't use these features right now. Just use Babel: it transforms ES6 code into ES5.
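For example, a minimal Babel setup might look like the sketch below (an assumption on my part: it uses the @babel/core and @babel/preset-env packages and a babel.config.js file; adjust to your own toolchain):
// babel.config.js - a minimal sketch, not a complete build setup
module.exports = {
  presets: ['@babel/preset-env'] // transpiles modern syntax down to what your targets support
};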
apply in this way, I removed my comments from yours. I still think using apply/spread this way is bad advice, but since no one cares...
const flatten = arr => [].concat(...arr)
You can use Underscore:
var x = [[1], [2], [3, 4]];
_.flatten(x); // => [1, 2, 3, 4]
If you only want to flatten one level, pass true for the second argument.
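For instance (an illustrative sketch of Underscore's documented shallow flag):
_.flatten([[1], [2, [3]]]);       // => [1, 2, 3]      (deep by default)
_.flatten([[1], [2, [3]]], true); // => [1, 2, [3]]    (only one level)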
Generic procedures mean we don't have to rewrite complexity each time we need to utilize a specific behaviour. concatMap (or flatMap) is exactly what we need in this situation.
// concat :: ([a],[a]) -> [a]
const concat = (xs,ys) =>
xs.concat (ys)
// concatMap :: (a -> [b]) -> [a] -> [b]
const concatMap = f => xs =>
xs.map(f).reduce(concat, [])
// id :: a -> a
const id = x =>
x
// flatten :: [[a]] -> [a]
const flatten =
concatMap (id)
// your sample data
const data =
[["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]]
console.log (flatten (data))
And yes, you guessed it correctly: it only flattens one level, which is exactly how it should work.
Imagine some data set like this:
// Player :: (String, Number) -> Player
const Player = (name,number) =>
[ name, number ]
// Team :: (...Player) -> Team
const Team = (...players) =>
players
// Game :: (Team, Team) -> Game
const Game = (teamA, teamB) =>
[ teamA, teamB ]
// sample data
const teamA =
Team (Player ('bob', 5), Player ('alice', 6))
const teamB =
Team (Player ('ricky', 4), Player ('julian', 2))
const game =
Game (teamA, teamB)
console.log (game)
// [ [ [ 'bob', 5 ], [ 'alice', 6 ] ],
// [ [ 'ricky', 4 ], [ 'julian', 2 ] ] ]
Ok, now say we want to print a roster that shows all the players that will be participating in game …
const gamePlayers = game =>
flatten (game)
gamePlayers (game)
// => [ [ 'bob', 5 ], [ 'alice', 6 ], [ 'ricky', 4 ], [ 'julian', 2 ] ]
If our flatten procedure flattened nested arrays too, we'd end up with this garbage result …
const gamePlayers = game =>
badGenericFlatten(game)
gamePlayers (game)
// => [ 'bob', 5, 'alice', 6, 'ricky', 4, 'julian', 2 ]
That's not to say you never want to flatten nested arrays, too – only that it shouldn't be the default behaviour.
We can make a deepFlatten procedure with ease …
// concat :: ([a],[a]) -> [a]
const concat = (xs,ys) =>
xs.concat (ys)
// concatMap :: (a -> [b]) -> [a] -> [b]
const concatMap = f => xs =>
xs.map(f).reduce(concat, [])
// id :: a -> a
const id = x =>
x
// flatten :: [[a]] -> [a]
const flatten =
concatMap (id)
// deepFlatten :: [[a]] -> [a]
const deepFlatten =
concatMap (x =>
Array.isArray (x) ? deepFlatten (x) : x)
// your sample data
const data =
[0, [1, [2, [3, [4, 5], 6]]], [7, [8]], 9]
console.log (flatten (data))
// [ 0, 1, [ 2, [ 3, [ 4, 5 ], 6 ] ], 7, [ 8 ], 9 ]
console.log (deepFlatten (data))
// [ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 ]
There. Now you have a tool for each job – one for squashing one level of nesting, flatten, and one for obliterating all nesting, deepFlatten.
Maybe you can call it obliterate or nuke if you don't like the name deepFlatten.
Don't iterate twice!
Of course the above implementations are clever and concise, but using a .map followed by a call to .reduce means we're actually doing more iterations than necessary.
Using a trusty combinator I'm calling mapReduce helps keep the iterations to a minimum; it takes a mapping function m :: a -> b, a reducing function r :: (b,a) -> b and returns a new reducing function - this combinator is at the heart of transducers; if you're interested, I've written other answers about them.
// mapReduce :: (a -> b, (b,a) -> b) -> (b,a) -> b
const mapReduce = (m,r) =>
(acc,x) => r (acc, m (x))
// concatMap :: (a -> [b]) -> [a] -> [b]
const concatMap = f => xs =>
xs.reduce (mapReduce (f, concat), [])
// concat :: ([a],[a]) -> [a]
const concat = (xs,ys) =>
xs.concat (ys)
// id :: a -> a
const id = x =>
x
// flatten :: [[a]] -> [a]
const flatten =
concatMap (id)
// deepFlatten :: [[a]] -> [a]
const deepFlatten =
concatMap (x =>
Array.isArray (x) ? deepFlatten (x) : x)
// your sample data
const data =
[ [ [ 1, 2 ],
[ 3, 4 ] ],
[ [ 5, 6 ],
[ 7, 8 ] ] ]
console.log (flatten (data))
// [ [ 1, 2 ], [ 3, 4 ], [ 5, 6 ], [ 7, 8 ] ]
console.log (deepFlatten (data))
// [ 1, 2, 3, 4, 5, 6, 7, 8 ]
concat itself doesn't blow up the stack, only ... and apply do (along with very large arrays). I didn't see it. I just feel terrible right now.
concat in Javascript has a different meaning than in Haskell. Haskell's concat ([[a]] -> [a]) would be called flatten in Javascript and is implemented as foldr (++) [] (Javascript: foldr(concat) ([]), assuming curried functions). Javascript's concat is a weird append ((++) in Haskell), which can handle both [a] -> [a] -> [a] and a -> [a] -> [a].
You could also call it flatMap, because that is exactly what concatMap is: the bind instance of the list monad. concatMap is implemented as foldr ((++) . f) []. Translated into Javascript: const flatMap = f => foldr(comp(concat) (f)) ([]). This is of course similar to your implementation without comp.
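For readers who want to run that translation, here is a self-contained sketch; foldr and comp are my own minimal curried helpers, not definitions from the comment above:
const concat = xs => ys => xs.concat(ys);
const comp = f => g => x => f(g(x));
const foldr = f => acc => xs => xs.reduceRight((a, x) => f(x)(a), acc);
// flatMap (a.k.a. concatMap): map each element to an array, then flatten one level
const flatMap = f => foldr(comp(concat)(f))([]);
console.log(flatMap(x => [x, x * 10])([1, 2, 3])); // [1, 10, 2, 20, 3, 30]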
To flatten an array of single-element arrays, you don't need to import a library; a simple loop is both the simplest and most efficient solution:
for (var i = 0; i < a.length; i++) {
  a[i] = a[i][0]; // replace each single-element inner array with its first element (mutates a in place)
}
To downvoters: please read the question, don't downvote because it doesn't suit your very different problem. This solution is both the fastest and simplest for the asked question.
Note that this would turn ['foo', ['bar']] into ['f', 'bar'].
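If the input might contain plain strings, a defensive variant of the loop (my sketch, not from the original answer) only unwraps entries that really are arrays:
for (var i = 0; i < a.length; i++) {
  if (Array.isArray(a[i])) a[i] = a[i][0]; // leave non-array entries such as 'foo' untouched
}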
You can also try the new Array.flat()
method. It works in the following manner:
let arr = [["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]].flat()
console.log(arr);
The flat() method creates a new array with all sub-array elements concatenated into it recursively up to a depth of 1 by default (i.e. one level of arrays inside arrays).
If you also want to flatten out 3-dimensional or even higher-dimensional arrays, you simply call the flat method multiple times. For example (3 dimensions):
let arr = [1,2,[3,4,[5,6]]].flat().flat().flat();
console.log(arr);
The Array.flat() method is relatively new. Older browsers like IE might not have implemented it. If you want your code to work on all browsers, you might have to transpile your JS to an older version. Check the MDN web docs for current browser compatibility.
You can also flatten arrays of any depth in a single call by passing the Infinity argument, like this: arr.flat(Infinity)
A solution for the more general case, when you may have some non-array elements in your array:
function flattenArrayOfArrays(a, r){
if(!r){ r = []}
for(var i=0; i<a.length; i++){
if(a[i].constructor == Array){
flattenArrayOfArrays(a[i], r);
}else{
r.push(a[i]);
}
}
return r;
}
Beware of calling this as flattenArrayOfArrays(arr, 10) or flattenArrayOfArrays(arr, [1,[3]]); those second arguments are added to the output.
Another ECMAScript 6 solution in functional style:
Declare a function:
const flatten = arr => arr.reduce(
(a, b) => a.concat(Array.isArray(b) ? flatten(b) : b), []
);
and use it:
flatten( [1, [2,3], [4,[5,[6]]]] ) // -> [1,2,3,4,5,6]
console.log( flatten([1, [2,3], [4,[5],[6,[7,8,9],10],11],[12],13]) )
Consider also the native Array.prototype.flat() function (now part of ES2019), available in recent releases of modern browsers. Thanks to @Константин Ван and @Mark Amery for mentioning it in the comments.
The flat function has one parameter, specifying the expected depth of array nesting, which equals 1 by default.
[1, 2, [3, 4]].flat(); // -> [1, 2, 3, 4]
[1, 2, [3, 4, [5, 6]]].flat(); // -> [1, 2, 3, 4, [5, 6]]
[1, 2, [3, 4, [5, 6]]].flat(2); // -> [1, 2, 3, 4, 5, 6]
[1, 2, [3, 4, [5, 6]]].flat(Infinity); // -> [1, 2, 3, 4, 5, 6]
let arr = [1, 2, [3, 4]];
console.log( arr.flat() );
arr = [1, 2, [3, 4, [5, 6]]];
console.log( arr.flat() );
console.log( arr.flat(1) );
console.log( arr.flat(2) );
console.log( arr.flat(Infinity) );
(Note that recursive solutions like this can throw RangeError: Maximum call stack size exceeded on very deeply nested or huge arrays.)
What about using the reduce(callback[, initialValue]) method of JavaScript 1.8?
list.reduce((p,n) => p.concat(n),[]);
would do the job.
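For example, with data shaped like the question's (list here is just a placeholder name):
const list = [["$6"], ["$12"], ["$25"]];
console.log(list.reduce((p, n) => p.concat(n), [])); // ["$6", "$12", "$25"]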
You can use Array.flat() with Infinity for any depth of nested array.
var arr = [ [1,2,3,4], [1,2,[1,2,3]], [1,2,3,4,5,[1,2,3,4,[1,2,3,4]]], [[1,2,3,4], [1,2,[1,2,3]], [1,2,3,4,5,[1,2,3,4,[1,2,3,4]]]] ];
let flatten = arr.flat(Infinity)
console.log(flatten)
Check browser compatibility before using it.
See lodash flatten, underscore flatten (with shallow set to true).
function flatten(arr) {
return arr.reduce((acc, e) => acc.concat(e), []);
}
function flatten(arr) {
return [].concat.apply([], arr);
}
Tested with
test('already flatted', () => {
expect(flatten([1, 2, 3, 4, 5])).toEqual([1, 2, 3, 4, 5]);
});
test('flats first level', () => {
expect(flatten([1, [2, [3, [4]], 5]])).toEqual([1, 2, [3, [4]], 5]);
});
See lodash flattenDeep, underscore flatten
function flattenDeep(arr) {
return arr.reduce((acc, e) => Array.isArray(e) ? acc.concat(flattenDeep(e)) : acc.concat(e), []);
}
Tested with
test('already flatted', () => {
expect(flattenDeep([1, 2, 3, 4, 5])).toEqual([1, 2, 3, 4, 5]);
});
test('flats', () => {
expect(flattenDeep([1, [2, [3, [4]], 5]])).toEqual([1, 2, 3, 4, 5]);
});
It's better to use Array.prototype.concat.apply([], arr) here, because [].concat creates an extra array just to get to the concat function. Runtimes may or may not optimize it away when they run it, but accessing the function on the prototype doesn't look any uglier than this already is in any case.
Using the spread operator:
const input = [["$6"], ["$12"], ["$25"], ["$25"], ["$18"], ["$22"], ["$10"]];
const output = [].concat(...input);
console.log(output); // --> ["$6", "$12", "$25", "$25", "$18", "$22", "$10"]
Please note: when Function.prototype.apply ([].concat.apply([], arrays)) or the spread operator ([].concat(...arrays)) is used in order to flatten an array, both can cause stack overflows for large arrays, because every argument of a function is stored on the stack.
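A rough illustration of that failure mode (the exact argument limit varies by engine, so treat this as a sketch):
const huge = Array.from({ length: 1000000 }, () => [1]);
// Both of these put one argument per inner array on the call stack:
// [].concat(...huge);        // may throw RangeError: Maximum call stack size exceeded
// [].concat.apply([], huge); // same risk
// The fold-based flatten below avoids that, because concat is always called with two operands at a time.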
Here is a stack-safe implementation in functional style that weighs up the most important requirements against one another:
// small, reusable auxiliary functions:
const foldl = f => acc => xs => xs.reduce(uncurry(f), acc); // aka reduce
const uncurry = f => (a, b) => f(a) (b);
const concat = xs => y => xs.concat(y);
// the actual function to flatten an array - a self-explanatory one-line:
const flatten = xs => foldl(concat) ([]) (xs);
// arbitrary array sizes (until the heap blows up :D)
const xs = [[1,2,3],[4,5,6],[7,8,9]];
console.log(flatten(xs));
// Deriving a recursive solution for deeply nested arrays is trivial now
// yet more small, reusable auxiliary functions:
const map = f => xs => xs.map(apply(f));
const apply = f => a => f(a);
const isArray = Array.isArray;
// the derived recursive function:
const flattenr = xs => flatten(map(x => isArray(x) ? flattenr(x) : x) (xs));
const ys = [1,[2,[3,[4,[5],6,],7],8],9];
console.log(flattenr(ys));
As soon as you get used to small arrow functions in curried form, function composition and higher order functions, this code reads like prose. Programming then merely consists of putting together small building blocks that always work as expected, because they don't contain any side effects.
const flatten = (arr) => arr.reduce((a, b) => a.concat(b), []);
saves you the visual garbage and the explanation to your teammates of why you need 3 extra functions, and some function calls too.
I recommend a space-efficient generator function:
function* flatten(arr) {
if (!Array.isArray(arr)) yield arr;
else for (let el of arr) yield* flatten(el);
}
// Example:
console.log(...flatten([1,[2,[3,[4]]]])); // 1 2 3 4
If desired, create an array of flattened values as follows:
let flattened = [...flatten([1,[2,[3,[4]]]])]; // [1, 2, 3, 4]
Note the use of the spread operator ... to iterate through the generator.
If you only have arrays with 1 string element:
[["$6"], ["$12"], ["$25"], ["$25"]].join(',').split(',');
will do the job. But that specifically matches your code example.
['$4', ["$6"], ["$12"], ["$25"], ["$25", "$33", ['$45']]].join(',').split(',')
[1,4, [45, 't', ['e3', 6]]].toString().split(',') or [1,4, [45, 't', ['e3', 6], false]].toString().split(',')
I have done it using recursion and closures:
function flatten(arr) {
var temp = [];
function recursiveFlatten(arr) {
for(var i = 0; i < arr.length; i++) {
if(Array.isArray(arr[i])) {
recursiveFlatten(arr[i]);
} else {
temp.push(arr[i]);
}
}
}
recursiveFlatten(arr);
return temp;
}
A Haskellesque approach
// note: the truthiness check on x means flattening stops at the first falsy element (e.g. 0, '', false)
function flatArray([x,...xs]){
  return x ? [...Array.isArray(x) ? flatArray(x) : [x], ...flatArray(xs)] : [];
}
var na = [[1,2],[3,[4,5]],[6,7,[[[8],9]]],10];
fa = flatArray(na);
console.log(fa);
If you use lodash, you can just use its flatten method: https://lodash.com/docs/4.17.14#flatten
The nice thing about lodash is that it also has methods to flatten the arrays:
i) recursively: https://lodash.com/docs/4.17.14#flattenDeep
ii) up to n levels of nesting: https://lodash.com/docs/4.17.14#flattenDepth
For example
const _ = require("lodash");
const pancake = _.flatten(array)
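Illustrative calls for the three lodash helpers mentioned above (results per the lodash docs):
const _ = require("lodash");
_.flatten([[1], [2, [3]]]);            // => [1, 2, [3]]   (one level)
_.flattenDeep([[1], [2, [3, [4]]]]);   // => [1, 2, 3, 4]  (recursive)
_.flattenDepth([1, [2, [3, [4]]]], 2); // => [1, 2, 3, [4]] (two levels)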
ES6 way:
const flatten = arr => arr.reduce((acc, next) => acc.concat(Array.isArray(next) ? flatten(next) : next), [])
const a = [1, [2, [3, [4, [5]]]]]
console.log(flatten(a))
ES5 way for a flatten function with an ES3 fallback for N-times nested arrays:
var flatten = (function() {
if (!!Array.prototype.reduce && !!Array.isArray) {
return function(array) {
return array.reduce(function(prev, next) {
return prev.concat(Array.isArray(next) ? flatten(next) : next);
}, []);
};
} else {
return function(array) {
var arr = [];
var i = 0;
var len = array.length;
var target;
for (; i < len; i++) {
target = array[i];
arr = arr.concat(
(Object.prototype.toString.call(target) === '[object Array]') ? flatten(target) : target
);
}
return arr;
};
}
}());
var a = [1, [2, [3, [4, [5]]]]];
console.log(flatten(a));
I was goofing with ES6 Generators the other day and wrote this gist, which contains...
function flatten(arrayOfArrays=[]){
function* flatgen() {
for( let item of arrayOfArrays ) {
if ( Array.isArray( item )) {
yield* flatten(item)
} else {
yield item
}
}
}
return [...flatgen()];
}
var flatArray = flatten([[1, [4]],[2],[3]]);
console.log(flatArray);
Basically I'm creating a generator that loops over the original input array. If it finds an array, it uses the yield* operator in combination with recursion to continually flatten the internal arrays. If the item is not an array, it just yields the single item. Then, using the ES6 spread operator (aka splat operator), I flatten out the generator into a new array instance.
I haven't tested the performance of this, but I figure it is a nice simple example of using generators and the yield* operator.
But again, I was just goofing so I'm sure there are more performant ways to do this.
Just the best solution without lodash:
let flatten = arr => [].concat.apply([], arr.map(item => Array.isArray(item) ? flatten(item) : item))
I would rather transform the whole array, as-is, to a string, but unlike other answers, I would do that using JSON.stringify and not the toString() method, which produces an unwanted result.
With that JSON.stringify output, all that's left is to remove all the brackets, wrap the result with start and ending brackets yet again, and feed the result to JSON.parse, which brings the string back to "life".
var arr = ["abc",[[[6]]],["3,4"],"2"];
var s = "[" + JSON.stringify(arr).replace(/\[|]/g,'') +"]";
var flattened = JSON.parse(s);
console.log(flattened)
["345", "2", "3,4", "2"]
instead of separating each of those values to separate indices
"3,4"
.
// using Es6 flat()
let arr = [1,[2,[3,[4,[5,[6,7],8],9],10]]]
console.log(arr.flat(Infinity))
// using Es6 reduce()
let flatIt = (array) => array.reduce(
(x, y) => x.concat(Array.isArray(y) ? flatIt(y) : y), []
)
console.log(flatIt(arr))
// using recursion
function myFlat(array) {
let flat = [].concat(...array);
return flat.some(Array.isArray) ? myFlat(flat) : flat;
}
console.log(myFlat(arr));
// using string manipulation
let strArr = arr.toString().split(',');
for(let i=0;i<strArr.length;i++)
strArr[i]=parseInt(strArr[i]);
console.log(strArr)
I think array.flat(Infinity) is a perfect solution. But the flat function is relatively new and may not run in older versions of browsers. We can use a recursive function to solve this.
const arr = ["A", ["B", [["B11", "B12", ["B131", "B132"]], "B2"]], "C", ["D", "E", "F", ["G", "H", "I"]]]
const flatArray = (arr) => {
const res = []
for (const item of arr) {
if (Array.isArray(item)) {
const subRes = flatArray(item)
res.push(...subRes)
} else {
res.push(item)
}
}
return res
}
console.log(flatArray(arr))
reduce + concat is O((N^2)/2), whereas the accepted answer (just one call to concat) would be at most O(N*2) on a bad browser and O(N) on a good one. Also, Denys's solution is optimized for the actual question and up to 2x faster than the single concat. For the reduce folks it's fun to feel cool writing tiny code, but, for example, if the array had 1000 one-element subarrays, all the reduce + concat solutions would be doing 500500 operations, whereas the single concat or a simple loop would do 1000 operations.
You can also use array.flat(Infinity), where Infinity is the maximum depth to flatten.
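To put rough numbers on that, here is a micro-benchmark sketch (timings will vary by machine and engine; treat it as illustrative, not definitive):
const arrays = Array.from({ length: 1000 }, () => [1]);

console.time('reduce + concat');
arrays.reduce((a, b) => a.concat(b), []);
console.timeEnd('reduce + concat');

console.time('single concat');
[].concat.apply([], arrays);
console.timeEnd('single concat');

console.time('simple loop');
const out = [];
for (let i = 0; i < arrays.length; i++) out.push(arrays[i][0]);
console.timeEnd('simple loop');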