Is there a straightforward way to assign a pandas DataFrame to a multiprocessing SharedMemory buffer? Converting the DataFrame to numpy arrays first or pickling it is not an option. Also, the DataFrame is on the order of gigabytes.
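For context, below is a minimal sketch of the usual `multiprocessing.shared_memory` workflow on Python 3.8+, which is exactly the route the question wants to avoid because it goes through the DataFrame's numpy representation. The variable names (`df`, `shm`, `view`) are illustrative only.

```python
from multiprocessing import shared_memory

import numpy as np
import pandas as pd

# "Parent" side: copy a numeric DataFrame's values into a shared memory block.
df = pd.DataFrame({"a": np.arange(5, dtype=np.float64),
                   "b": np.arange(5, dtype=np.float64)})
values = df.to_numpy()  # the numpy conversion the question rules out
shm = shared_memory.SharedMemory(create=True, size=values.nbytes)
buf = np.ndarray(values.shape, dtype=values.dtype, buffer=shm.buf)
buf[:] = values[:]

# "Child" side: attach to the block by name and rebuild a DataFrame view.
# Column names and dtypes are not stored in the buffer and would have to be
# passed separately (e.g. as plain Python objects).
existing = shared_memory.SharedMemory(name=shm.name)
view = np.ndarray(values.shape, dtype=values.dtype, buffer=existing.buf)
df_view = pd.DataFrame(view, columns=df.columns)

# Cleanup once every process is done with the block.
existing.close()
shm.close()
shm.unlink()
```

This sketch only works cleanly for a homogeneous numeric DataFrame; mixed dtypes, object columns, or the index itself cannot be dropped into a single raw buffer this way, which is presumably why the question asks for an alternative.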
This question is a duplicate of [multiprocessing in python - sharing large object (e.g. pandas dataframe) between multiple processes](https://stackoverflow.com/questions/22487296/multiprocessing-in-python-sharing-large-object-e-g-pandas-dataframe-between) – yakutsa Jan 21 '22 at 00:06