Grace (2023) explained that AI can give a personalized experience to users.
The previous study by Castro et al. (2020) on the use of apps to support the ageing community points to the need for this functionality.
That study comprised participants with differing levels of physical activity and experience with technology, and a one-size-fits-all approach to technology does not meet all of their needs. AI could help bridge this gap.
Grace (2023) outlined several areas in which AI can impact personalized fitness apps; these areas also touch on some of the functionalities proposed by the participants in the study discussed above.
Using personal data points such as age, gender, fitness level, exercise history, and health metrics can help create a personalized workout plan and possibly a nutritional plan (Grace, 2023).
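To make this more concrete, the short sketch below shows one simple, rule-based way such data points could shape a starting plan. The profile fields, categories, and thresholds are my own illustrative assumptions, not something described by Grace (2023); a real app would more likely rely on a learned model than on fixed rules.

```python
# Minimal, rule-based sketch of a personalized plan. All field names, categories,
# and thresholds here are illustrative assumptions, not values from Grace (2023).
from dataclasses import dataclass

@dataclass
class UserProfile:
    age: int
    fitness_level: str        # assumed categories: "low", "moderate", "high"
    weekly_sessions: int      # self-reported exercise history
    resting_heart_rate: int   # one example health metric

def suggest_plan(profile: UserProfile) -> dict:
    """Map a few personal data points to a starting workout recommendation."""
    if profile.fitness_level == "low" or profile.weekly_sessions == 0:
        intensity, minutes = "light", 20
    elif profile.fitness_level == "moderate":
        intensity, minutes = "moderate", 30
    else:
        intensity, minutes = "vigorous", 45

    # Gentler progression for older users or a higher resting heart rate.
    if profile.age >= 65 or profile.resting_heart_rate > 80:
        intensity, minutes = "light", min(minutes, 25)

    return {"intensity": intensity,
            "minutes_per_session": minutes,
            "sessions_per_week": max(profile.weekly_sessions + 1, 2)}

print(suggest_plan(UserProfile(age=70, fitness_level="low",
                               weekly_sessions=1, resting_heart_rate=78)))
```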
One of the participants' concerns in the Castro et al. (2020) study was injury prevention. AI technology that uses motion tracking can offer form and technique correction.
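As a rough illustration of the kind of check motion tracking makes possible, the sketch below computes a knee angle from three tracked joint positions and turns it into simple form feedback. The coordinates and the angle threshold are hypothetical placeholders, not values from the cited sources.

```python
# Hypothetical sketch of a form check from motion-tracking output. The joint
# coordinates and the 100-degree threshold are placeholders for illustration.
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by points a-b-c (each an (x, y) pair)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norms = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norms))

# Example hip, knee, and ankle positions as a pose-tracking model might report them.
hip, knee, ankle = (0.42, 0.55), (0.45, 0.70), (0.44, 0.85)
angle = joint_angle(hip, knee, ankle)
feedback = "good squat depth" if angle < 100 else "try bending your knees a little more"
print(f"Knee angle: {angle:.0f} degrees - {feedback}")
```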
The participants in the Castro et al. (2020) study wanted more motivational messages and a live avatar to provide encouragement. AI combined with natural language processing can understand user questions, provide instructions, and respond in real time (Grace, 2023). Because hiring a personal trainer can be costly, this offers an alternative way to keep fit and learn about fitness without a live instructor.
The insufficiently active and technologically unengaged user in the Castro et al. (2020) study would benefit greatly from this functionality. AI can also provide reminders and nudges to keep the user on track.
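The sketch below shows one minimal way such a nudge could be triggered. The three-day inactivity rule and the message wording are my own assumptions for illustration, not behaviour described in the cited studies.

```python
# Assumed nudge rule (three idle days) for illustration; not taken from the studies.
from datetime import date, timedelta
from typing import Optional

def nudge_message(last_workout: date, today: date, name: str) -> Optional[str]:
    """Return a gentle reminder if the user has been inactive for a few days."""
    days_idle = (today - last_workout).days
    if days_idle >= 3:
        return f"Hi {name}, it has been {days_idle} days - how about a short walk today?"
    return None  # no reminder needed yet

print(nudge_message(date.today() - timedelta(days=4), date.today(), "Alex"))
```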
Although AI promises enhanced device functionality, it raises ongoing issues around data collection. What happens to all the data collected in these apps? Is it ethical to sell it to analytics companies like Google or Facebook?
This issue is explored in the Hadley-Burke (2023) article about data collection in health-care apps and the role of AI. The article raised four pertinent points, which are discussed below.
According to one of the lead researchers interviewed for the article, the platforms collected only names, email addresses, and phone numbers for internal marketing; however, IP addresses and cookie histories were also shared with Facebook or Google.
Hadley-Burke (2023) reported that this is still harmful to some degree: because these platforms provide one type of service, the shared data can offer a great deal of insight into the kinds of concerns patients have.
Hadley-Burke (2023) also explained that once information is de-identified in Canada, an entity can do whatever it sees fit with that information.
The article also stated that if these pieces of information are fed into automated systems, social bias can be incorporated into them.
Human intervention is still needed to weed out the social biases present in these automated decision-making systems, and a diverse data set is needed to train them in order to help combat such bias.
Hadley-Burke (2023) interviewed a medical doctor, who explained that the rate of data collection can help with sharing information on how to diagnose and treat illnesses. This could also promote a higher standard of care because of the data provided.
The doctor interviewed left readers with a question that I think is applicable to this project: can machine learning help us understand our individual bodies and tailor diagnoses and therapies accordingly?
In the spirit of this project, tailoring fitness activities could be included in the question.
Any emerging technology or functionality will always have disadvantages. Privacy issues will be an ongoing facet of AI technology, as companies and organizations (private or governmental) have an interest in controlling or monetizing the information collected from users. I think privacy concerns ultimately come down to questions of ethics and morality. Does the end truly justify the means? Does monetizing user information contribute to building a better society? As AI technology becomes more sophisticated, people will pressure government entities for more regulation. As users, we will also come to know where to position ourselves in relation to AI.
Historically, people have been averse to new technology. For example, actors were concerned about the emergence of television because they wanted to be properly compensated for the amount of time their movies played on TV (Palmer & Ogunbiyi, 2023). As a result, the royalty system was further improved (Palmer & Ogunbiyi, 2023). I think something similar will eventually happen with AI. Arguably, attitudes towards AI are already changing, as the current climate is one of acceptance and exploration. When used properly, AI could help automate systems, free up our time for more creative endeavours, and aid learning.
The next section summarizes the project in relation to one of the course themes: What does it mean to be an exemplary model of mobile/open learning?
Click on the conclusion button below to continue reading.