Scraping software that visits as many LinkedIn user pages as possible :-D. The goal is to gain visibility for your account, since LinkedIn notifies a user whenever someone visits their profile.
Uses: Scrapy, Selenium WebDriver, headless Chromium, Docker and Python 3.
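As a rough sketch of how these pieces fit together: the Scrapy spiders perform their page visits through a Chrome instance running inside a Selenium container (see the sections below), reached via Selenium's Remote WebDriver. The snippet below is illustrative only, not the project's actual spider code; it assumes the container is listening on localhost:4444, and the profile URL is a placeholder.

```python
# Illustrative only: connect to the Chrome instance inside the Selenium
# container (assumed to listen on localhost:4444) and fetch one page,
# the same way the Scrapy spiders drive their page visits.
from selenium import webdriver

options = webdriver.ChromeOptions()
driver = webdriver.Remote(
    command_executor="http://localhost:4444/wd/hub",
    options=options,
)
try:
    # Placeholder URL; the real spiders navigate to the profiles they discover.
    driver.get("https://www.linkedin.com/in/some-public-profile/")
    html = driver.page_source  # raw HTML handed over for parsing
    print(driver.title)
finally:
    driver.quit()
```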
Docker makes running the project quick and painless.
Install Docker from the official website: https://www.docker.com/
Install a VNC viewer if you do not have one. On Ubuntu, go for Vinagre:
sudo apt-get update
sudo apt-get install vinagre
Then connect to localhost:5900, password: secret
Open conf.py and fill in the quotes with your LinkedIn credentials.
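The exact contents of conf.py depend on the project, but as a sketch (with hypothetical variable names) the credentials file looks roughly like this:

```python
# conf.py -- sketch with hypothetical variable names; keep the names the
# project's conf.py already uses and only fill in the quoted values.
EMAIL = "your.address@example.com"   # the LinkedIn account's login e-mail
PASSWORD = "your-linkedin-password"  # the LinkedIn account's password
```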
First, open your terminal, move to the project's root folder (usually with the cd command) and type:
docker-compose up -d --build
Run your VNC viewer and enter the address and port localhost:5900. The password is secret.
Use your terminal again and, in the same window, type:
docker-compose down
Create the Selenium server:
docker run --name selenium -p 4444:4444 -p 5900:5900 --publish-all --shm-size="128M" selenium/standalone-chrome-debug
Then create a virtual environment, install the dependencies and run the tests:
virtualenv -p python .venv
source .venv/bin/activate
pip install -r requirements.txt
python -m unittest test.py
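The project ships its own test.py; purely as an idea of the kind of smoke test that can run against the standalone Selenium server started above, a hypothetical minimal test could look like this:

```python
# Hypothetical smoke test (not the project's actual test.py): it only checks
# that the standalone Selenium server started above accepts browser sessions.
import unittest

from selenium import webdriver


class SeleniumServerSmokeTest(unittest.TestCase):
    def test_can_open_a_page(self):
        driver = webdriver.Remote(
            command_executor="http://localhost:4444/wd/hub",
            options=webdriver.ChromeOptions(),
        )
        try:
            driver.get("https://www.linkedin.com/")
            self.assertIn("LinkedIn", driver.title)
        finally:
            driver.quit()


if __name__ == "__main__":
    unittest.main()
```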
Stop and delete the Selenium server:
docker stop $(docker ps -aq --filter name=selenium)
docker rm $(docker ps -aq --filter name=selenium)