A colleague in the community was reporting on her experience teaching tech to a group of kids, and expressing how tech sometimes makes it hard to love tech, and her post pushed me to think about it. This is an experience a lot of us have.
We want to show a newbie group how to run with a lovely new tool, only to realize that to run the new tool, we first have to work around whatever troubles come our way. Some of the troubles come from the differences in our environments. Some from the fact that no one updates and maintains some of the computers. Some from feature choices (ads in particular) that get in the way. It's not enough that the thing we're building works; the operating system underneath and even the antivirus running in the background are equally part of the user's experience.
Those of us teaching in tech have found workarounds for environmental differences. My favorite is to either work on my own computer through ensemble programming, or, if I want to teach newbies for real, work on their computers and make time to solve the real-life problems they will face because things are different.
Quality, while good enough, is currently not great. And it has not been only getting better in recent years, even if some people speak of AI as being today the worst it will ever be.
Looking at AI-enabled products, very few of them are really making things better.
I was purchasing a lot of books from Amazon during my vacation time, and seeing the AI-generated summaries on top of reviews wasn't helping me with my selections. I went for #booktok instead, as the AI renders the first recommendation pretty much useless to me - I don't want a summary, I want to find someone like me.
Trying out the new AI-powered search from Google didn't produce better results either, and I am pretty sure they too know that the AI-powered version is doing worse than the classic one.
Apple just announced - finally - some AI-powered features with ChatGPT integration, looking like a placeholder to grow worthwhile features in, but otherwise appearing to add little usefulness to the product.
Everyone seeks uses of AI. Experiments with AI. I do too. I would still report that in the last two years of actively using AI tools, they have not added to my productivity, but they have sometimes added to my quality (I'm really good at not being happy with AI's results) and they have added to my fun and enjoyment.
Current uses of AI are driving down the experience we have of good enough quality. To include a placeholder for AI to grow into - assuming it is only getting better - we make quite significant compromises in the value our products and services provide. The goal of AI placeholders is not to make things better today. It is to make space for making things better soon, and meanwhile users may take a dip in the relevance they experience.
Over the years, I have come to note that truly seeking good quality is not common. The words we use to explain the level we are targeting or achieving are hidden behind the fuzzy term "quality". Even bad quality is quality. Making things fun and entertaining is quality.
Our real target may be usefulness and productivity: being able to do things with tech we weren't able to do before, and being able to do the things we really need to do. While we find our way there with GenAI, we're going to experience a dip in good enough quality to allow for large-scale experimentation on us.