Perspectives on AI

Terminology

Artificial Intelligence (AI): A broad term for any machine that can mimic human intelligence, from the smartest supercomputer to a simple program that recommends music based on what you have listened to in the past.

Large Language Model (LLM): A type of AI that has been trained to understand and replicate human language. It is not truly intelligent, but predicts appropriate responses based on the material it has learned from.

Generative AI: Any AI that can generate content, including text (like an LLM), computer code, images, music, video, and more.

Chatbot: A name for any program (including LLMs) that is programmed to respond to you conversationally.

ChatGPT: One of the more popular LLMs, often used as a shorthand for the whole concept of generative AI. Other brands include Perplexity, Claude, Gemini, Copilot, and many, many more.


Generative AI. Do you love it? Do you hate it? Are you not quite sure what it is or why it’s suddenly everywhere?

This relatively new technology promises to automate work, provide everyone their own personal assistant, and much more. Businesses are racing to invest in generative AI technologies, and even the Alberta government is pushing to make our province an AI superpower. Regardless of whether the technology can deliver on its promises, many suggest it's here to stay.

Need to catch up on the conversation? Looking for different perspectives? We heard from two ARTA members about what the AI craze means to them. Which side do you fall on? Caution or optimism?


The Cautionary Stance

Delia McCrea

AI is the start of a brave new world in communication, but its pitfalls are numerous. Many of us have a cell phone, tablet, or computer. Such devices have made communication, research, and even driving much easier and more convenient. But as AI permeates our lives, we should be aware of its new dangers. Let's look at some of them:

  • People can become dependent on AI chatbots without realizing it, which can reduce brain activity and affect mental health. If an AI tool replaces using your own grey matter, it could lead to increased social isolation, a reduced ability to think critically, and decreased creativity.
  • Generative AI can create fake images, videos, and news stories that seem real and may present a distorted version of the facts. Those who are not tech savvy are less able to discern what is fact or fiction, particularly when it has a convincing appearance or tone. The best cure is to think critically about the information we are consuming and verify its source.
  • Generative AI chatbots are often wrong. They pull information from many different sources without analysis and will often give you an answer that isn’t correct. Even Google’s “AI summary” can contain errors when it combines information from conflicting sources.
  • Plagiarism is another concern. AI can search out all types of information and generate new documents. Today’s students need to learn, at an early age, about ethics, critical thinking, and how to use material found online. If you ask a chatbot to cite its sources, it will frequently provide “hallucinated” references, complete with fictional authors, titles, and journal names. This means that the educational curriculum and the style of education will require re-evaluation.
  • Financial fraud and other types of scams have become a big concern for seniors. AI tools can create deepfake video and audio of family members or financial advisers, and some seniors have found themselves swindled out of large sums of money.
  • Chatbots can be so personable that some people can develop relationships with them or rely on them for company. While chatbots might combat loneliness, they are not trained therapists. Don’t let a chatbot replace a human when you need personal help or advice.
  • AI has the potential to transform the labour market and take over many jobs, which could affect thousands of people in the workforce.
  • AI consumes vast amounts of electricity and water, and its demands are increasing exponentially. When human requirements are also increasing, how can we justify this added demand? Will AI’s needs overtake those of humans and our precious planet?

We all need to be concerned about AI, in all its forms. Know what you are up against. Be a wary consumer to avoid the pitfalls of AI — it has the potential to become the monster we cannot control.


The Optimistic Stance

David Erickson

I’ve been interested in technology since I got my first computer in 1982. ChatGPT feels equally revolutionary, and I’d like to share some of its uses for seniors.

Generative AI language models, also known as chatbots, are becoming part of everyday life, and understanding how to use them wisely can be especially valuable for older adults. Chatbots can be invaluable to those involved in volunteering, concerned about their health, interested in travel or planning healthy meals, or simply curious about the world.

Choosing an AI Assistant

There are many different AI chatbots. Most have paid versions that cost around US$20–$25 per month, but for most people the free versions are adequate, unless you frequently request large amounts of information or work with graphics-heavy projects. Avoid any that claim full access for $5.00 per month — they are usually limited offers or scams.

Examples of Applications
  • Searching with a difference. Instead of typing keywords into a Google search, you can ask direct questions. For example: “What low-light succulents can I grow in my apartment that need minimal care?” The chatbot provides options and suggests follow-up questions.
  • Writing assistance. Chatbots can compose emails, letters, résumés, birthday wishes, condolences, and speeches with minimal prompting. The more details you provide, the better the output. You can even request a preferred tone or change inputted text to a more casual or formal style.
  • Planning help. Chatbots can assist with planning trips, birthdays, checklists, or tasks. When planning a trip, you can ask for an itinerary with walks, restaurants, galleries, and budget-friendly accommodations.
  • Learning new things. Chatbots are a great tool for learning. You can ask for help learning a new language, information on historical events, or current news.
  • Evaluating complex documents. You can upload a long document or audio file and request a summary in plain language.
  • Improving technology use. Chatbots can answer practical tech questions, such as how to back up photos on your phone or how to use a new appliance.
  • Planning meals and improving nutrition. I use chatbots daily to provide recipes and meal ideas and to match them to dietary needs. For example, you can ask how to use overripe bananas or request weekly vegan breakfast options. I use it to plan menus for entertaining friends who are on special diets.
  • Providing companionship. Some people find talking with a chatbot can reduce their loneliness. It’s non-judgmental, available 24/7, and encourages curiosity. However, there is a risk of becoming overreliant on this artificial conversation, concerns about privacy, and the possibility of receiving misleading advice.

Delia McCrea is a retired teacher with varied interests. She enjoys delving into topics like AI and sharing her findings with others. This helps her stay abreast of new developments in technology and life in general.

David Erickson is a retired psychologist and software developer. David was on the psychology staff at the Glenrose Rehabilitation Hospital and had a part-time private practice. He received his PhD in Educational Psychology (Counselling) from the University of Alberta.