I am trying to find the length of a Dask dataframe column using len(dataframe[column]).
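Roughly, this is what I am running on a local distributed cluster (the file path and column name are placeholders for my real data):

    import dask.dataframe as dd
    from dask.distributed import Client

    client = Client()  # local cluster, default worker memory limits

    # "data.csv" and "my_column" stand in for my actual file and column
    df = dd.read_csv("data.csv")
    print(len(df["my_column"]))  # this call is what triggers the failure

Every time I execute this, the workers run out of memory and I get this error: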
distributed.nanny - WARNING - Worker exceeded 95% memory budget. Restarting
distributed.nanny - WARNING - Restarting worker
Traceback (most recent call last):
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingqueues.py", line 238, in _feed
send_bytes(obj)
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingconnection.py", line 200, in send_bytes
self._send_bytes(m[offset:offset + size])
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingconnection.py", line 280, in _send_bytes
ov, err = _winapi.WriteFile(self._handle, buf, overlapped=True)
BrokenPipeError: [WinError 232] The pipe is being closed
distributed.nanny - ERROR - Nanny failed to start process
Traceback (most recent call last):
File "C:UsershaknehAppDataLocalContinuumanaconda3libsite-packagesdistributed
anny.py", line 575, in start
await self.process.start()
File "C:UsershaknehAppDataLocalContinuumanaconda3libsite-packagesdistributedprocess.py", line 34, in _call_and_set_future
res = func(*args, **kwargs)
File "C:UsershaknehAppDataLocalContinuumanaconda3libsite-packagesdistributedprocess.py", line 202, in _start
process.start()
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingprocess.py", line 112, in start
self._popen = self._Popen(self)
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingcontext.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingcontext.py", line 322, in _Popen
return Popen(process_obj)
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingpopen_spawn_win32.py", line 89, in __init__
reduction.dump(process_obj, to_child)
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessing
eduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingconnection.py", line 948, in reduce_pipe_connection
dh = reduction.DupHandle(conn.fileno(), access)
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingconnection.py", line 170, in fileno
self._check_closed()
File "C:UsershaknehAppDataLocalContinuumanaconda3libmultiprocessingconnection.py", line 136, in _check_closed
raise OSError("handle is closed")
OSError: handle is closed
distributed.nanny - WARNING - Worker exceeded 95% memory budget. Restarting
distributed.nanny - WARNING - Worker exceeded 95% memory budget. Restarting
distributed.nanny - WARNING - Worker exceeded 95% memory budget. Restarting
My Dask dataframe has about 10 million rows. Is there any way I can get past this error?
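One workaround I have come across is to read in smaller blocks and count rows per partition, so that no single worker ever has to materialize the whole column at once; would something like this be the right direction (the file path and column name are again placeholders)?

    # a smaller blocksize keeps each partition small, which may stay
    # within the 95% worker memory budget mentioned in the warnings
    df = dd.read_csv("data.csv", blocksize="64MB")

    # each partition's length is computed independently on the workers;
    # only the small per-partition counts travel back to the client
    n = df["my_column"].map_partitions(len).compute().sum()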