Built with Python3 · Built with Selenium

LinkedIn Automation Tool

This tool is designed for automating tasks on LinkedIn. It leverages technologies such as Scrapy, Selenium WebDriver, Chromium (in headless mode), Docker, and Python3.

Sponsor:

Proxycurl APIs enrich people and company profiles with structured data

Scrape public LinkedIn people and company profile data at scale with Proxycurl APIs.

  • Scraping public profiles is battle-tested in court (hiQ vs. LinkedIn)
  • GDPR, CCPA, SOC2 compliant
  • High rate limit - 300 requests/minute
  • Fast - APIs respond in ~2s
  • Fresh data - 88% of data is scraped in real time; the remaining 12% is no older than 29 days
  • High accuracy
  • Tons of data points returned per profile

Built for developers, by developers.

Features

LinkedIn Spider

The LinkedIn Spider is designed to visit as many LinkedIn user pages as possible. The goal is to increase the visibility of your account, as LinkedIn notifies users when their profile has been viewed.
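The repository's spider is built on Scrapy; purely as an illustration of the idea, here is a minimal Selenium sketch that visits a hardcoded list of profile URLs. The URLs, and the assumption that the session is already authenticated, are placeholders rather than the project's actual code.

    # Minimal sketch: visit a list of LinkedIn profile URLs with headless Chromium.
    # Assumes an already-authenticated session (cookies loaded elsewhere) and that
    # PROFILE_URLS is supplied by you -- both are placeholders, not the repo's API.
    import random
    import time

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    PROFILE_URLS = [
        "https://www.linkedin.com/in/example-profile-1/",
        "https://www.linkedin.com/in/example-profile-2/",
    ]

    options = Options()
    options.add_argument("--headless")        # run Chromium without a visible window
    options.add_argument("--no-sandbox")      # commonly needed inside Docker

    driver = webdriver.Chrome(options=options)
    try:
        for url in PROFILE_URLS:
            driver.get(url)                   # the visit itself triggers LinkedIn's
                                              # "viewed your profile" notification
            time.sleep(random.uniform(3, 8))  # pause so the traffic looks less robotic
    finally:
        driver.quit()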

Companies Spider

The Companies Spider gathers information about all users working for a specific company on LinkedIn. It operates by the following steps (a rough sketch appears after the list):

  1. Navigating to the company's LinkedIn page.
  2. Clicking on the "See all employees" button.
  3. Collecting user-related data.
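A rough sketch of this flow with Selenium, not the repository's actual spider; the company URL, link text, and CSS selector below are placeholders, and LinkedIn's real markup changes often:

    # Rough sketch of the Companies Spider flow with Selenium.
    # URL, link text, and selector are illustrative placeholders only.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    driver.get("https://www.linkedin.com/company/example-company/")   # 1. company page

    # 2. follow the "See all employees" link (exact link text is an assumption)
    driver.find_element(By.PARTIAL_LINK_TEXT, "employees").click()

    # 3. collect user-related data from the results (names only, as an example)
    for card in driver.find_elements(By.CSS_SELECTOR, ".entity-result__title-text"):
        print(card.text)

    driver.quit()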

Installation and Setup

You will need the following:

  • Docker
  • Docker Compose
  • A VNC viewer (e.g., Vinagre for Ubuntu)

Steps

  1. Prepare your environment: Install Docker from the official website. If you don't have a VNC viewer, install one. For Ubuntu, you can use Vinagre:

     sudo apt-get update
     sudo apt-get install vinagre

  2. Set up your LinkedIn login and password: Copy conf_template.py to conf.py and fill in your LinkedIn credentials (a minimal example follows these steps).

  3. Build and run the containers with Docker Compose: Open your terminal, navigate to the project folder, and run one of:

     make companies
     make random
     make byname

  4. Monitor the browser's activity: Open Vinagre and connect to localhost:5900. The password is "secret". Alternatively, you can use the command:

     make view

  5. Stop the scraper: When you are done, use the command:

     make down
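For step 2, conf.py only needs to hold your LinkedIn credentials. A minimal sketch of what the file might contain; the variable names are an assumption, so mirror whatever conf_template.py actually defines:

    # conf.py -- copied from conf_template.py and filled in with your credentials.
    # Variable names below are assumptions; use the names from conf_template.py.
    EMAIL = "your-linkedin-email@example.com"
    PASSWORD = "your-linkedin-password"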

Testing

make test

Legal Disclaimer

This code is not affiliated with, authorized, maintained, sponsored, or endorsed by LinkedIn or any of its affiliates or subsidiaries. This is an independent and unofficial project. Use at your own risk.

This project violates LinkedIn's User Agreement Section 8.2. As a result, LinkedIn may temporarily or permanently ban your account. We are not responsible for any actions taken by LinkedIn in response to the use of this tool.

