Array of hashes to hash
Solution 1
You may use
a.reduce Hash.new, :merge
which directly yields
{:a=>:b, :c=>:d}
Note that in case of key collisions, order matters: later hashes override earlier mappings. For example:
[{a: :b}, {c: :d}, {e: :f, a: :g}].reduce Hash.new, :merge # {:a=>:g, :c=>:d, :e=>:f}
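A quick sketch (variable names are my own) showing that the hashes inside the original array stay untouched, since merge is non-destructive:

```ruby
a = [{a: :b}, {c: :d}]

# Each merge step returns a brand-new hash, so the hashes
# inside `a` are never modified.
merged = a.reduce(Hash.new, :merge)

merged   #=> {:a=>:b, :c=>:d}
a.first  #=> {:a=>:b}  (unchanged)
```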
Solution 2
You can use .inject:

a.inject(:merge)
#=> {:a=>:b, :c=>:d}
This creates a new hash on each iteration from the two being merged. To avoid that, you can use the destructive :merge! (or :update, which is an alias):
a.inject(:merge!)
#=> {:a=>:b, :c=>:d}
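One caveat worth sketching (the sample array is mine): with no initial value, inject uses the array's first element as the accumulator, so the destructive :merge! modifies that hash in place:

```ruby
a = [{a: :b}, {c: :d}]

# Without a starting value, inject folds into a[0] itself,
# so :merge! mutates the first hash in the array.
result = a.inject(:merge!)

result   #=> {:a=>:b, :c=>:d}
a.first  #=> {:a=>:b, :c=>:d} -- a[0] was changed in place
```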
Solution 3
These two are equivalent:
total_hash = hs.reduce({}) { |acc_hash, hash| acc_hash.merge(hash) }
total_hash = hs.reduce({}, :merge)
Note that Hash#merge creates a new hash on each iteration, which may be a problem if you are building a big one. In that case, use update instead:
total_hash = hs.reduce({}, :update)
You can also convert the hashes to pairs and then build the final hash:
total_hash = hs.flat_map(&:to_a).to_h
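A small sketch (the sample array is mine) of the pair-based approach; like merge, Array#to_h lets later pairs win on key collisions:

```ruby
hs = [{a: :b}, {c: :d}, {a: :g}]

# Flatten each hash into [key, value] pairs, then rebuild one hash.
pairs = hs.flat_map(&:to_a)  #=> [[:a, :b], [:c, :d], [:a, :g]]

# As with merge, later pairs override earlier ones on collisions.
total_hash = pairs.to_h      #=> {:a=>:g, :c=>:d}
```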
Solution 4
I came across this answer and I wanted to compare the two options in terms of performance to see which one is better:
a.reduce Hash.new, :merge
a.inject(:merge)
Using the Ruby Benchmark module, it turns out that option (2), a.inject(:merge), is faster.

The code used for comparison:
require 'benchmark'

input = [{b: "c"}, {e: "f"}, {h: "i"}, {k: "l"}]
n = 50_000

Benchmark.bm do |benchmark|
  benchmark.report("reduce") do
    n.times do
      input.reduce Hash.new, :merge
    end
  end
  benchmark.report("inject") do
    n.times do
      input.inject(:merge)
    end
  end
end
The results were:
user system total real
reduce 0.125098 0.003690 0.128788 ( 0.129617)
inject 0.078262 0.001439 0.079701 ( 0.080383)
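Since reduce and inject are aliases, the gap above comes from the extra Hash.new seed rather than from the method name itself. A sketch comparing the two seedless forms (timings will vary by machine):

```ruby
require 'benchmark'

input = [{b: "c"}, {e: "f"}, {h: "i"}, {k: "l"}]
n = 50_000

# reduce and inject are aliases, so without the Hash.new seed
# the two forms should perform essentially the same.
Benchmark.bm(16) do |bm|
  bm.report("reduce, no seed") { n.times { input.reduce(:merge) } }
  bm.report("inject, no seed") { n.times { input.inject(:merge) } }
end
```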
Solution 5
Just use
a.reduce(:merge)
#=> {:a=>:b, :c=>:d}
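One edge case worth noting: with no initial value, reduce on an empty array returns nil, while seeding with an empty hash always yields a hash:

```ruby
# Without a seed, reduce has nothing to fold and returns nil.
no_seed = [].reduce(:merge)     #=> nil

# With an empty hash as the seed, the result is always a hash.
seeded  = [].reduce({}, :merge) #=> {}
```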
Updated on July 05, 2022

Comments
-
evfwcqcg almost 2 years: For example, I have an array of single hashes: a = [{a: :b}, {c: :d}]. What is the best way to convert it into {a: :b, c: :d}?
-
tokland about 12 years: Hash.new, or as friends like to call him, {} :-) Much as I like the pure functional solution, note that merge will create a new hash on every iteration; we can use update instead (it won't mess with the input hashes, that's the important point): hs.reduce({}, :update)
-
Jason over 9 years: @tokland, post your comment as a separate answer - it should get more visibility
-
Paul Danelli over 5 years: That's crazy elegant. Thank you.
-
Greg Tarsa about 5 years: This result confused me. The docs say reduce and inject are aliased. A quick check with your test shows the slowdown is due to Hash.new as the initializer: :merge creates a new hash each iteration; :update doesn't. So a re-run with :update shows that, even with the Hash.new, the :update version is faster:
```
                              user      system    total     real
reduce w/ Hash.new & :update  0.056754  0.002097  0.058851  (0.059330)
reduce w/ :merge only         0.090021  0.001081  0.091102  (0.091257)
```
-
Greg Tarsa about 5 years: If your application allows for it, the :update version suggested by tokland is the faster option.