Concise overview of things learned:
Scraped the Amazon seller discussion forum – Selenium WebDriver & BeautifulSoup
Cleaned and wrangled the data – Python
Built a baseline model for comparison
Learned and explored BERT, attention, and Seq2Seq models
Built BERT and ran two different versions of it
Explored different classification techniques – Logistic Regression, KNN, Naive Bayes, Decision Tree, Random Forest
Improved accuracy by removing stopwords and applying other preprocessing techniques
Hosted and deployed on AWS
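One of the preprocessing steps listed above, stopword removal, can be sketched in a few lines of Python. The stopword set below is a small illustrative subset, not the full list used in the project:

```python
# Minimal stopword-removal sketch; the stopword set below is a small
# illustrative subset, not the full list used in the project.
STOPWORDS = {"a", "an", "the", "is", "are", "to", "of", "and", "my", "in"}

def remove_stopwords(text: str) -> str:
    """Lowercase the text and drop common stopwords."""
    tokens = text.lower().split()
    return " ".join(t for t in tokens if t not in STOPWORDS)

cleaned = remove_stopwords("The listing is removed and my account is suspended")
print(cleaned)  # -> "listing removed account suspended"
```

Dropping high-frequency, low-information words like these shrinks the vocabulary the classifiers have to learn, which is one reason it helped accuracy.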
Divided the team into three sub-teams, based on members' expertise and skills
Explained the project to the team and outlined the tasks to be completed each week
Provided the team with directions and resources to learn ML models and Python
Held office hours each week to clear up any doubts and errors faced by participants
Ran team-building exercises to connect the entire team
Assigned tasks to the Tech Lead and Project Manager for each upcoming meeting
Conducted meetings twice a week to review project progress and discuss next steps
Meetings and trainings attended:
Attended all team meetings (conducted at least once a week, sometimes twice a week)
Attended or watched recordings of all industry training sessions (attended almost all of the webinars conducted by Colin, as well as the ml_teamleads meetings)
Detailed statement of tasks done:
Week 1: Started with introductions and connected with all the participants and leads. Helped with setting up accounts on Asana, G Suite, and Slack. Played a game for team building and getting to know each other better. Asked participants to fill out a form capturing each one's technical expertise. Explained the workflow. Assigned the sub-teams, who would work together and on which part. Provided Python resources and the ML Coursera course. Explained the project – the end product being to classify similar posts together using ML models.
Week 2: Started with a lot of questions about web scraping. I opened the meeting by explaining how a data analysis or machine learning model should be built. Assigned the Tech Lead to give an overview of how web scraping is done. Provided web scraping materials to the team. Helped with technical difficulties.
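The scraping workflow combined Selenium WebDriver to load forum pages with BeautifulSoup to parse them. A minimal sketch of the BeautifulSoup side is below; the HTML snippet and class names are illustrative placeholders, not the forum's actual markup:

```python
from bs4 import BeautifulSoup

# Illustrative HTML; the real pages were fetched with Selenium WebDriver,
# and the forum's actual tag structure and class names will differ.
html = """
<div class="topic"><a class="title">Account suspended without notice</a></div>
<div class="topic"><a class="title">How to appeal a listing removal?</a></div>
"""

soup = BeautifulSoup(html, "html.parser")
titles = [a.get_text(strip=True) for a in soup.select("div.topic a.title")]
print(titles)  # -> ['Account suspended without notice', 'How to appeal a listing removal?']
```

The same pattern (CSS selector, then `get_text`) extends to post bodies, authors, and timestamps once the real selectors are known.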
Week 3: Explained how to work with Git and GitHub. Demonstrated hands-on how to upload, merge, and create branches on GitHub and collaborate as a team. Next, discussed the project progress, explained data formatting, cleaning, and processing to the team, and introduced BERT, attention, and Seq2Seq models.
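The attention mechanism introduced this week can be illustrated with a minimal scaled dot-product attention in NumPy. The shapes and values are toy examples, not anything from the project's data:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Row-wise softmax (shifted by the max for numerical stability).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))  # 2 queries, d_k = 4
K = rng.normal(size=(3, 4))  # 3 keys
V = rng.normal(size=(3, 4))  # 3 values

output, weights = scaled_dot_product_attention(Q, K, V)
print(output.shape)          # (2, 4): one context vector per query
print(weights.sum(axis=-1))  # each row of attention weights sums to 1
```

Each output row is a weighted average of the value vectors, with weights given by how strongly that query matches each key – the core idea BERT's layers build on.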
Week 4: Solved difficulties in code and answered doubts during office hours. Explained the different versions of BERT to the team and experimented by combining the data of the different sub-teams.
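Combining the sub-teams' scraped data can be sketched with pandas. The column names and rows below are assumptions for illustration, not the project's actual schema:

```python
import pandas as pd

# Hypothetical sub-team frames; 'post' and 'category' are illustrative columns.
team_a = pd.DataFrame({"post": ["Account suspended", "Listing removed"],
                       "category": ["account", "listing"]})
team_b = pd.DataFrame({"post": ["Listing removed", "Payment delayed"],
                       "category": ["listing", "payments"]})

# Stack the frames, drop rows scraped by more than one sub-team,
# and renumber the index.
combined = (pd.concat([team_a, team_b], ignore_index=True)
              .drop_duplicates()
              .reset_index(drop=True))
print(len(combined))  # -> 3 unique posts
```

Deduplicating after the concat matters here because sub-teams scraping overlapping forum sections will collect some of the same posts.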
Week 5: Explained how to improve accuracy. Held an office hour before the team's final presentations so they could clear up any remaining questions and errors they were facing. Provided classification materials to learn more about the different techniques and their implementation. Had the team present their project work.
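The classification techniques covered (Logistic Regression, KNN, Naive Bayes, Decision Tree, Random Forest) can be compared with a short scikit-learn loop. The synthetic dataset below is a stand-in for the real forum features:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# Toy data standing in for the vectorized forum posts.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
}

# Fit each model and record its held-out accuracy.
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = model.score(X_test, y_test)
    print(f"{name}: {scores[name]:.3f}")
```

Running every candidate through the same train/test split makes the accuracy numbers directly comparable, which is the point of a baseline comparison like this.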