I am trying to write the scraped data to a CSV file. In the terminal I see all the data that I need, but the CSV ends up with only half of the results.
Here is my code:
```python
import csv
import time

# `driver` is an already-initialised Selenium WebDriver (set up earlier).

# Lists that collect results across all pages
Employee_Names = []
Employee_URLS = []
Current_Positions = []
Current_Position_Headlines = []

for i in range(1, 101):
    print(i)
    driver.get("url")
    time.sleep(2)

    # Scrolling down the page step by step so all results load
    total_height = int(driver.execute_script(
        "return document.body.scrollHeight"))
    for scroll_pos in range(1, total_height, 5):
        driver.execute_script("window.scrollTo(0, {});".format(scroll_pos))
    time.sleep(3)

    # Getting the name of the employee
    Employee_Name = driver.find_elements_by_xpath(
        "//*[@class='app-aware-link']/span[1]")
    for name in Employee_Name:
        Employee_Names.append(name.text)
    print(Employee_Names)

    # Getting the employee LinkedIn URL
    Employee_A = driver.find_elements_by_xpath(
        "//*[@class='entity-result__title-text t-16']/a")
    for link in Employee_A:
        Employee_URLS.append(link.get_attribute("href"))
    print(Employee_URLS)

    # Getting the current position and company
    Current_Position = driver.find_elements_by_xpath(
        "//*[@class='entity-result__primary-subtitle t-14 t-black']")
    for position in Current_Position:
        Current_Positions.append(position.text)
    print(Current_Positions)

    # Getting the current headline
    Current_Position_Headline = driver.find_elements_by_xpath(
        "//*[@class='entity-result__summary t-12 t-black--light mb1']")
    for c_position in Current_Position_Headline:
        Current_Position_Headlines.append(c_position.text)
    print(Current_Position_Headlines)

    # Writing the results to CSV (the file is rewritten on every page)
    with open('linkedinsearch.csv', 'w') as f:
        writer = csv.writer(f)
        writer.writerows(zip(Employee_Names, Employee_URLS,
                             Current_Positions, Current_Position_Headlines))

    driver.implicitly_wait(5)
    print("going to the next page")
    time.sleep(3)

print("Request successful")
```
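For reference, this is how I compare what the script has collected in memory with what actually ends up in the file after the loop finishes (a minimal sketch, reusing the list names and the `linkedinsearch.csv` file name from the code above):

```python
import csv

# Quick check: list lengths in memory vs. rows written to the CSV.
# Uses the same list names and file name as the script above.
print(len(Employee_Names), len(Employee_URLS),
      len(Current_Positions), len(Current_Position_Headlines))

with open('linkedinsearch.csv', newline='') as f:
    row_count = sum(1 for _ in csv.reader(f))
print("rows in csv:", row_count)
```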
Can someone suggest how I can get all of the results that I see in the terminal into the CSV?