
Efficient Data Structure For Storing N Lists Where N Is Very Large

I need to store N lists, where N is large (around 1 million), for example: [2,3] [4,5,6] ... [4,5,6,7]. Each item is a list of roughly 0 to 10,000 elements. I wanted to use a numpy array.

Solution 1:

Maybe use a dictionary:

d = {}
for i in range(N):
    d[i] = your_nth_list

And you can append to any of them with:

d[k].append(additional_items)

(This stays efficient even for 10,000,000 lists of 1,000 items each.)
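A minimal runnable sketch of this approach (the small N and placeholder data are illustrative; the real lists would come from wherever the question's N lists are produced):

```python
# Map each index to its own list.
N = 1_000_000
d = {}
for i in range(N):
    d[i] = []  # start empty, or assign an existing list here

# Appending to the k-th list is an O(1) dict lookup plus a list append.
d[42].append(7)
d[42].extend([8, 9])
print(d[42])  # [7, 8, 9]
```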

Solution 2:

Unless the elements you're storing follow some pattern, you have to use a nested list, since there is no other way to keep each sublist's elements separate from the others.

In Python:

listOfLists = [[1,2,3],
               [4,5,6],
               [7,8,9]]

Whenever you want to operate on this list, you can use numpy functions:

>>> import numpy as np
>>> np.mean(listOfLists)
5.0
>>> np.max(listOfLists)
9
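Note that this only works when the sublists all have the same length; the ragged lists in the question cannot form a regular 2-D array. One option (my addition, not from the answer) is to flatten before applying NumPy reductions:

```python
import numpy as np
from itertools import chain

# Ragged sublists, as in the question.
ragged = [[2, 3], [4, 5, 6], [7]]

# Flatten into a single 1-D array, then reduce.
flat = np.fromiter(chain.from_iterable(ragged), dtype=float)
print(flat.mean())  # 4.5
print(flat.max())   # 7.0
```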

Solution 3:

Try a nested list:

nestedList = [[2,3],[4,5,6]]

Solution 4:

You could use nested lists, but they are not efficient in terms of complexity; in fact, lookup is linear. You could use a dictionary to get better results:

d = {}
for i in range(number_of_lists):
    d[str(i)] = your_ith_list

Then access the i-th list with d[str(i)]; appending an element is just as easy: d[str(i)].append(new_element).
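A runnable sketch of the same idea; using `collections.defaultdict` here is my addition, not part of the answer, and it removes the need to pre-create each list:

```python
from collections import defaultdict

# defaultdict(list) creates an empty list on first access to a key,
# so the million lists need not be initialized up front.
d = defaultdict(list)

d["0"].append(2)
d["0"].append(3)
d["1"].extend([4, 5, 6])

print(d["0"])  # [2, 3]
print(d["1"])  # [4, 5, 6]
```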

