Tuesday, May 5, 2020

Algorithmically Generated Consent

I’ve started listening to the algorithmically generated playlist that Spotify makes for me every week -- I’m actually listening to it as I write this -- and it’s been fun! It’s a lot of music that I’ve never heard before, mixed in with some that I had, but only through other sources. It’s never anything I’d really go out of my way to look for on my own, and that’s a cool thing.

And yet, it worries me a little, because while I do like most of the songs that have been thrown my way, it either means I’ve let the algorithm collect enough information on me to make some pretty accurate guesses, or it means I have no taste and will like almost anything. I’m actually a little partial to the second explanation; it means unless a song is mastered poorly, I can just let it slide into the background. But the first fear is still right there, and a bit more worrisome.

One of my resolutions for this and every year was to be “out there” on the internet a bit more, and that’s meant opening myself up to these sorts of metadata collection schemes. I’m sure Google knows all about me, and not just because Blogspot is a Google-owned property. Mostly I hope to just be ignored to an extent: with billions upon billions of people using the internet, the fact that I am probably just a bit of data amongst all that is actually a little comforting to me. But that doesn’t mean I don’t notice when personalized content gets thrown at me.

YouTube’s algorithm is a bit weird, for example. I don’t have the space nor the ability to go into exact details, but it tends to reward more “controversial” (especially politics-wise) content, particularly when it’s churned out quickly. So on completely unrelated videos, I’ll see something with a headline trying to be eye-catching and a little “Recommended for you” tag under it, and all I can do is say “Huh.”

Maybe that’s comforting too, that they still guess wrong. Not all the songs on Spotify’s playlists are ones I like, though I’m sure they notice when I skip over them. I do think the black-box nature of the algorithm is scary, of course, but it also makes it difficult to do more than just gesture at it.

A bit of a disclaimer, though. There are algorithms that do more than just recommend media to people or sell ads based on interests. Credit scores, for example, have started determining more than just whether someone can pay back a loan, which I think is harmful and dangerous for the same reasons I find the former ones ineffectually weird at times: they could get things wrong. But also for the same reasons, all I can do, from this blog at least, is mention it.

-F
