Revisiting AI Project Cycle with Data Science

The Scenario:

1. Problem Scoping

Let us look at various factors around this problem through the 4Ws problem canvas.
Who Canvas – Who has the problem, or who are the stakeholders?
• People who suffer from stress and are at the onset of depression.

What Canvas – What is the nature of the problem?
• People who need help are reluctant to consult a psychiatrist and hence continue to suffer.

Where Canvas – Where does the problem arise, or in what context/situation do the stakeholders experience it?
• When they are going through a stressful period due to unpleasant experiences.

Why Canvas – Why is this a problem worth solving, and how would a solution improve the stakeholders' situation?
• People get a platform where they can talk and vent out their feelings anonymously.
• People get a medium that can interact with them, apply primitive Cognitive Behavioural Therapy (CBT), and suggest professional help whenever needed.
• People would be able to vent out their stress.

Now that we have gone through all the factors around the problem, the Problem Statement Template comes together as follows:

Our (Who?): people undergoing stress
Have a problem of (What?): not being able to share their feelings
While (Where?): they need help in venting out their emotions
An ideal solution would (Why?): provide them a platform to share their thoughts anonymously and suggest help whenever required.

This leads us to the goal of our project which is:
“To create a chatbot which can interact with people, help them to vent out their feelings and take them through primitive CBT.”

2. Data Acquisition

To understand the sentiments of people, we need to collect their conversational data so the machine can interpret the words they use and understand their meaning. Such data can be collected through various means:
1. Surveys
2. Observing the therapist’s sessions
3. Databases available on the internet
4. Interviews, etc.
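
For instance, free-text survey responses might be gathered into a working corpus with a short script such as the one below. This is only a sketch: the file name survey_responses.csv and the column name response are assumptions for illustration, not part of the scenario.

```python
import csv

def load_survey_responses(path):
    """Read free-text survey responses from a CSV file into a list.

    Assumes one response per row in a column named 'response';
    both the file name and the column name are illustrative.
    """
    responses = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = (row.get("response") or "").strip()
            if text:                      # skip empty entries
                responses.append(text)
    return responses

# corpus = load_survey_responses("survey_responses.csv")
```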


3. Data Exploration

Once the data has been collected, it needs to be cleaned and processed. The text is therefore normalised through various steps and reduced to a minimal vocabulary, since the machine does not need grammatically correct statements, only the essence of the text.
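
A minimal sketch of such normalisation in Python is shown below. It assumes simple whitespace tokenisation and a toy stopword list; a real project would use a fuller stopword list (for example, from NLTK) and usually add stemming or lemmatisation.

```python
import string

# A tiny illustrative stopword list; a real project would use a fuller,
# domain-checked list.
STOPWORDS = {"a", "an", "the", "is", "am", "are", "i", "to", "of", "and"}

def normalise(sentence):
    """Normalise one sentence: lowercase, strip punctuation,
    tokenise on whitespace, and drop stopwords."""
    sentence = sentence.lower()
    sentence = sentence.translate(str.maketrans("", "", string.punctuation))
    tokens = sentence.split()
    return [t for t in tokens if t not in STOPWORDS]

print(normalise("I am feeling very stressed, and I can't sleep!"))
# ['feeling', 'very', 'stressed', 'cant', 'sleep']
```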


4. Modelling

Once the text has been normalised, it is fed to an NLP-based AI model. Depending on the type of chatbot we want to build, many AI models are available that can form the foundation of the project.
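
As one illustrative possibility (not the only choice of model), a chatbot could begin with an intent classifier built on bag-of-words features. The sketch below uses scikit-learn's TF-IDF vectoriser with a Naive Bayes classifier; the utterances and intent labels are made up for illustration, whereas real training data would come from the acquisition step.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy, hand-made examples mapping user utterances to intents.
utterances = [
    "i feel so stressed about my exams",
    "nothing makes me happy anymore",
    "i had a great day today",
    "i am worried all the time",
]
intents = ["stress", "sadness", "positive", "anxiety"]

# TF-IDF bag-of-words features + Naive Bayes classifier — one simple
# choice among the many NLP models mentioned above.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(utterances, intents)

print(model.predict(["i am so worried about tomorrow"]))  # e.g. ['anxiety']
```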


5. Evaluation

The trained model is then evaluated, and its accuracy is measured on the basis of how relevant the machine's answers are to what the user says.
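
Continuing the illustrative sketch from the modelling step, one rough proxy for answer relevance is intent-classification accuracy on a few held-out, human-labelled utterances. The test data below is hypothetical, and `model` is the pipeline trained above.

```python
from sklearn.metrics import accuracy_score

# Hypothetical held-out test utterances with human-judged intent labels.
test_utterances = [
    "my exams are stressing me out",
    "today was a wonderful day",
]
true_intents = ["stress", "positive"]

predicted = model.predict(test_utterances)
print("Accuracy:", accuracy_score(true_intents, predicted))
```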