I am skilled in extracting data from any website and can provide the results in a structured format such as CSV, XLSX, JSON, Google Sheets, or a database of your choice.
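As a small illustration of that workflow, here is a minimal sketch of parsing a listing page and exporting the records as JSON or CSV. It uses only the Python standard library, and the page structure (`span` elements with hypothetical `name`/`price` classes) is an assumption for the example:

```python
import csv
import io
import json
from html.parser import HTMLParser

# Hypothetical listing markup: each product holds a name span and a price span.
SAMPLE_HTML = """
<div class="product"><span class="name">Widget</span><span class="price">9.99</span></div>
<div class="product"><span class="name">Gadget</span><span class="price">19.50</span></div>
"""

class ProductParser(HTMLParser):
    """Collect (name, price) records from spans tagged with the assumed classes."""
    def __init__(self):
        super().__init__()
        self._field = None  # which field the current <span> holds, if any
        self.rows = []      # accumulated {"name": ..., "price": ...} dicts

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field == "name":
            self.rows.append({"name": data, "price": None})
        elif self._field == "price":
            self.rows[-1]["price"] = float(data)
        self._field = None

def to_json(rows):
    """Serialize the scraped rows as pretty-printed JSON."""
    return json.dumps(rows, indent=2)

def to_csv(rows):
    """Serialize the scraped rows as CSV with a header line."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "price"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

parser = ProductParser()
parser.feed(SAMPLE_HTML)
```

The same rows could just as easily be pushed to Google Sheets or a database; the export step is intentionally decoupled from the parsing step.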
I am also proficient with Streamlit, a powerful framework for building interactive web applications for data visualization and analysis. By combining my web scraping skills with Streamlit, I not only collect data but also build visually appealing, user-friendly dashboards that let my clients easily interpret and use the information.
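A scraper-plus-dashboard pairing can be sketched as below. The CSV column names are assumptions for the example; the Streamlit import is kept inside the render function so the data-prep helper stays independently testable, and the app would be launched with `streamlit run app.py`:

```python
import csv
import io

def summarize(csv_text):
    """Pure helper: parse scraped CSV text and compute simple dashboard stats."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    prices = [float(r["price"]) for r in rows]
    return {"count": len(rows), "avg_price": sum(prices) / len(prices)}

def render_dashboard(csv_text):
    """Render the stats as Streamlit metrics (run via `streamlit run app.py`)."""
    import streamlit as st
    stats = summarize(csv_text)
    st.title("Scraped listings")
    st.metric("Items", stats["count"])
    st.metric("Average price", f"{stats['avg_price']:.2f}")
```

Keeping the parsing logic out of the Streamlit layer also makes it reusable for plain CSV/JSON deliverables.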
Tech Stack
🌐 Languages: Proficient in a range of programming languages, including Python and JavaScript (Node.js)
💾 Databases: Experience with a range of databases, including MySQL, MongoDB, Firebase, MS SQL Server, and PostgreSQL
📃 Scraping Tools: Familiarity with web scraping tools and libraries such as Scrapy, Selenium, Beautiful Soup, Pyppeteer, Puppeteer, HTMLAgilityPack, and Cheerio
🚧 Barriers: Able to work around anti-bot barriers such as reCAPTCHA, using techniques like IP rotation through proxy pools and calling a site's underlying APIs directly
🧰 IDE: Proficiency with a range of integrated development environments (IDEs) including VS Code, MS Visual Studio, PyCharm and Jupyter Notebook.
☁️ Cloud: AWS (EC2, S3, RDS), GCP (Compute Engine)
⚙ Other Technologies: Git, JupyterLab / Jupyter Notebook, Google Colab, Postman, Streamlit.
Technical skills:
✳️Custom web scraping and bot development
✳️Sneaker bot creation
✳️Web automation
✳️Development of automation tools for various tasks
✳️Scraping data from public APIs and websites
✳️Data extraction from public directories and real estate websites
✳️Streamlit-powered web scraper deployment for anytime access.
…and many more.
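For API scraping in particular, the recurring pattern is walking a paginated endpoint until it runs dry. A minimal sketch, with the HTTP call injected as a callable so any client (requests, urllib, etc.) can be plugged in; `fake_fetch` below is a stand-in for a real endpoint:

```python
def scrape_paginated(fetch_page, start=1):
    """Walk a page-numbered API, collecting records until a page comes back empty.

    `fetch_page(page)` is injected so the pattern is client-agnostic; a real
    implementation might be: lambda n: requests.get(f"{BASE}/items?page={n}").json()
    """
    records = []
    page = start
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

# Stand-in for a real HTTP call, used here only to demonstrate the pattern.
def fake_fetch(page):
    data = {1: [{"id": 1}, {"id": 2}], 2: [{"id": 3}]}
    return data.get(page, [])
```

Injecting the fetch function also makes the loop trivially unit-testable without touching the network.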