ruby on rails 3 - How to handle concurrent requests that delete and create the same rows?
I have a table that looks like the following:
game_stats table:
```
id | game_id    | player_id | stats     | (many other cols...)
---|------------|-----------|-----------|---------------------
 1 | 'game_abc' | 8         | 'r r b s' | ...
 2 | 'game_abc' | 9         | 's b s'   | ...
```
A user uploads the data for a given game in bulk, submitting both players' data at once. For example:
"game": { id: 'game_abc', player_stats: { 8: { stats: 'r r b s' }, 9: { stats: 's b s' } } }
Submitting this to the server should result in the table shown above.
Instead of updating the existing rows when the same data is submitted again (with revisions, for example), in the controller I first delete all existing rows in the game_stats table that have the given game_id:
```ruby
class GameStatController < ApplicationController
  def update
    game_id = params[:game][:id]

    GameStat.where("game_id = ?", game_id).destroy_all

    # rebuild one row per player from the submitted data
    params[:game][:player_stats].each do |player_id, stats|
      game_stat = GameStat.new(stats.merge(game_id: game_id, player_id: player_id))
      game_stat.save
    end
  end
end
```
This works fine on a single-threaded or single-process server. The problem is that I'm running Unicorn, a multi-process server. If two requests come in at the same time, there's a race condition:
```
Request 1: GameStat.where(...).destroy_all
Request 2: GameStat.where(...).destroy_all
Request 1: save new game_stats
Request 2: save new game_stats
```
Result: multiple game_stat rows with the same data.
I believe that somehow locking the rows or the table is the way to prevent multiple updates at the same time, but I can't figure out how to do it. Combining that with a transaction seems like the right thing to do, but I don't understand why.
Edit
To clarify why I can't figure out how to use locking: I can't lock a single row at a time, since the rows are deleted and not modified.
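In other words, pessimistic row locking in AR looks roughly like the sketch below, but it only takes locks on rows that already exist, so it does nothing to stop a concurrent request from inserting brand-new rows:

```ruby
# Sketch only: SELECT ... FOR UPDATE locks the existing rows before deleting them,
# but a concurrent request can still insert its own fresh rows afterwards,
# so the duplicate-row race remains.
GameStat.transaction do
  GameStat.where("game_id = ?", game_id).lock.destroy_all
  # another request is free to INSERT new game_stat rows at this point
end
```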
AR doesn't support table-level locking by default. You'll have to either execute DB-specific SQL or use a gem like Monogamy.
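For example, assuming PostgreSQL (the exact statement differs per database), an explicit table lock taken inside a transaction would serialize the delete-and-recreate:

```ruby
# Sketch, assuming PostgreSQL. LOCK TABLE must run inside a transaction;
# the lock is released on commit/rollback, so the second request waits
# until the first has finished its delete and inserts.
GameStat.transaction do
  GameStat.connection.execute("LOCK TABLE game_stats IN ACCESS EXCLUSIVE MODE")
  GameStat.where("game_id = ?", game_id).destroy_all
  params[:game][:player_stats].each do |player_id, stats|
    GameStat.create(stats.merge(game_id: game_id, player_id: player_id))
  end
end
```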
Wrapping the save statements in a transaction will speed things up, if nothing else.
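A rough sketch of that, batching the per-player saves into a single commit:

```ruby
# One transaction means a single commit for all players' rows
# instead of one commit per save.
GameStat.transaction do
  params[:game][:player_stats].each do |player_id, stats|
    GameStat.create(stats.merge(game_id: game_id, player_id: player_id))
  end
end
```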
Another alternative is to implement the lock with Redis. Gems like redis-lock are available. This is less risky since it doesn't touch the DB, and you can set the Redis keys to expire.
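A minimal sketch of the idea using the plain redis gem (the key name and timeout are placeholders; redis-lock and similar gems wrap this pattern with retries and safer release semantics):

```ruby
require "redis"

redis    = Redis.new
lock_key = "lock:game_stats:#{game_id}"

# SET with nx: true only succeeds if the key doesn't exist yet; ex: gives the
# lock a TTL so it is eventually released even if the process dies mid-request.
if redis.set(lock_key, "1", nx: true, ex: 10)
  begin
    GameStat.where("game_id = ?", game_id).destroy_all
    params[:game][:player_stats].each do |player_id, stats|
      GameStat.create(stats.merge(game_id: game_id, player_id: player_id))
    end
  ensure
    redis.del(lock_key)
  end
else
  # another request holds the lock: retry after a short sleep or return a conflict
end
```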