c++ - Efficiently processing a large number of unique elements (std::set vs other containers)

I have a std::set holding a large number of unique objects as its elements.

In the main thread of the program:

I take some objects from the set, assign the information to be processed to each of them, remove the objects from the set, and pass them to threads in a thread pool for processing. Once the threads finish processing an object, they add it back to the set (so that in the next iteration, the main thread can once again assign the next batch of information to those objects for processing).
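Roughly, the arrangement looks like this (a simplified sketch, not my actual code: Object and Info are stand-ins for the real types, and std::async stands in for the thread pool):

```cpp
#include <set>
#include <mutex>
#include <future>
#include <vector>

struct Info   { int payload = 0; };                  // stand-in for the real info
struct Object {
    int  id = 0;
    Info work;                                       // info assigned by the main thread
    bool operator<(const Object& rhs) const { return id < rhs.id; }
};

std::set<Object> objects;
std::mutex       objectsMutex;                       // guards `objects`

void process(Object&) { /* worker-side processing */ }

void dispatchBatch(const Info& info) {
    std::vector<std::future<void>> tasks;
    std::unique_lock<std::mutex> lock(objectsMutex);
    for (auto it = objects.begin(); it != objects.end(); ) {
        Object obj = *it;
        obj.work = info;                             // assign the info to process
        it = objects.erase(it);                      // remove the object from the set
        tasks.push_back(std::async(std::launch::async, [obj]() mutable {
            process(obj);
            std::lock_guard<std::mutex> g(objectsMutex);
            objects.insert(std::move(obj));          // add it back -- this can throw!
        }));
    }
    lock.unlock();
    for (auto& t : tasks) t.get();                   // wait for the whole batch
}
```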

This arrangement works perfectly, except that if an error occurs while adding an object back to the set (for example, std::set::insert() throws std::bad_alloc), everything goes for a toss. If I ignore the error and proceed, there is no way for that object to get back into the set; it remains out of the program's flow forever, causing a memory leak.
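To make the failure mode concrete (reusing the names from the sketch above): once the object has left the set, swallowing the exception on re-insert drops it from the workflow for good.

```cpp
// Reuses Object / objects / objectsMutex from the sketch above.
void returnToSet(Object obj) {
    std::lock_guard<std::mutex> g(objectsMutex);
    try {
        objects.insert(std::move(obj));   // allocates a tree node; may throw std::bad_alloc
    } catch (const std::bad_alloc&) {
        // If we swallow the error and return, nothing refers to `obj`
        // any more: it never re-enters the set and is lost to the
        // program's flow -- exactly the leak described above.
    }
}
```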

To address this issue, I tried not removing the objects from the set. Instead, each object has a member flag that indicates whether it is 'being processed'. The problem in this case is that the main thread encounters these 'being processed' objects again and again while iterating through the elements of the set, and that badly hampers performance (the number of objects in the set is quite large).
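The flag-based variant looks roughly like this (again a sketch; the flag has to be mutable because elements of a std::set are const, and the cross-thread synchronisation of the flag is omitted):

```cpp
#include <set>

struct FlaggedObject {
    int          id = 0;
    mutable bool beingProcessed = false;   // mutable: not part of the ordering key
    bool operator<(const FlaggedObject& rhs) const { return id < rhs.id; }
};

std::set<FlaggedObject> flagged;

void dispatchFlagged() {
    for (const auto& obj : flagged) {
        if (obj.beingProcessed)            // with many objects in flight, the scan
            continue;                      // wastes most of its time skipping these
        obj.beingProcessed = true;
        // ... hand `obj` off to the pool; the worker clears the flag when done
        // (flag synchronisation omitted in this sketch)
    }
}
```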

What are better alternatives here?

Can std::list be used instead of std::set? A list would not have the bad_alloc problem when adding an element back, since it only needs to re-link pointers. But how can I keep the list's elements unique? And if I do manage that, would it be as efficient as std::set?
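What I have in mind is something like two lists with splice() moving nodes between them, which only re-links pointers and cannot throw (a sketch, reusing Object from above; std::list does not enforce uniqueness, so that would have to be guaranteed when the elements are first created):

```cpp
#include <list>

std::list<Object> pending;    // waiting to be handed out by the main thread
std::list<Object> inFlight;   // currently owned by worker threads

void takeForProcessing(std::list<Object>::iterator it) {
    // O(1), no allocation, cannot fail: the node is just re-linked.
    inFlight.splice(inFlight.end(), pending, it);
}

void returnFromProcessing(std::list<Object>::iterator it) {
    // Equally safe on the way back: the very same node is reused.
    pending.splice(pending.end(), inFlight, it);
}
```

Iterators stay valid across splice(), so a worker could keep the iterator of the element it is processing and hand it back afterwards.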

Instead of removing elements from the std::set and adding them back, is there a way to move an element to the start or end of the set, so that unprocessed objects accumulate towards the start and processed ones towards the end?
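(For reference: a std::set is always ordered by its comparator, so elements cannot literally be moved to the front or back. But if a newer standard than the C++11 I am using were an option, C++17's extract() plus node re-insertion would achieve the safe round trip without any allocation. A sketch, reusing Object from above:)

```cpp
#include <set>
#include <utility>

std::set<Object> work;

void roundTrip(std::set<Object>::iterator it) {
    auto node = work.extract(it);    // C++17: unlinks the node, no deallocation
    // ... process node.value() on a worker thread ...
    work.insert(std::move(node));    // re-links the same node: cannot throw bad_alloc
}
```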

Is there any other solution, please?

c++ c++11 stl stdset
