The Tyranny of the Algo

Evil tech

As we’ve argued previously, here, here, and here, algorithms are no magic wand for sorting subjective content, artistic or otherwise. Here the top brass at Apple admits as much in a criticism of one of its competitors, Spotify.

Two take-aways from Mr. Cook’s argument. First, he claims that Apple uses the human creativity of its users to create playlists, but tuka uses its entire network of users to reward human curation of all content. Yeah, Web 3.0 is about the human, not the machine.

Second, the race among the streamers (Apple Music, Amazon Music, Google Play, Spotify, Pandora, etc.) is another loss-leading attempt to build out a user base in order to monetize the data flow. In other words, streaming doesn’t pay unless you can monetize the network in some other way. As I suspected, the pricing model of streaming is likely financially unsustainable in the long run, as the true price of streaming content is several times what consumers are now paying. Direct ownership of content may be far more economical than renting it without ad support.
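
A back-of-envelope sketch can make that renting-versus-owning comparison concrete. Every number below (subscription price, per-stream royalty, listening volume, download price) is an illustrative assumption rather than reported data; the point is the shape of the arithmetic, which readers can redo with their own figures.

```python
# Back-of-envelope sketch of the streaming-economics comparison above.
# Every figure here is an illustrative assumption, not reported data.

subscription_price = 9.99   # assumed monthly subscription price ($)
payout_per_stream = 0.004   # assumed blended per-stream royalty ($)
streams_per_month = 1_000   # assumed heavy listener (roughly 33 plays a day)
tracks_worth_owning = 30    # assumed tracks the listener would actually buy
download_price = 1.29       # assumed per-track purchase price ($)

royalties = payout_per_stream * streams_per_month
print(f"Royalties generated by one heavy listener: ${royalties:.2f}/month")
print(f"Subscription revenue from that listener:   ${subscription_price:.2f}/month")
print(f"Left over before any other costs:          ${subscription_price - royalties:.2f}/month")

# Ownership vs. rental: buy only the tracks you keep returning to,
# versus paying the subscription for three years.
ownership_cost = tracks_worth_owning * download_price
streaming_cost_3yr = subscription_price * 36
print(f"One-time cost to own {tracks_worth_owning} favorite tracks: ${ownership_cost:.2f}")
print(f"Cost of renting access for three years:    ${streaming_cost_3yr:.2f}")
```

Under these assumptions a heavy listener generates only a few dollars of royalties against a ten-dollar subscription, while three years of renting costs several times what owning a modest personal library would; change the assumptions and the conclusion may change with them.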

Apple’s CEO Says Spotify Is ‘Draining the Humanity Out of Music’

Apple CEO Tim Cook recently revealed that the company’s streaming music service, Apple Music, has clearly surpassed Spotify in subscribers in the US and Canada.

Instead of gloating, he humbly downplayed the achievement.

“The key thing in music is not the competition between the companies that are providing music, the real challenge is to grow the market.  If we put our emphasis on growing the market, which we’re doing, we’ll be the beneficiaries of that, as will others.”

In a recently published article with monthly business magazine Fast Company, Cook revealed his true feelings about his streaming music rivals.  Taking a clear swipe at Spotify, he said,

“We worry about the humanity being drained out of music, about it becoming a bits-and-bytes kind of world instead of the art and craft.”

The Creator’s Case for Blockchain

Social Media Connection

Nice article on Medium:

A Poet’s Case for Blockchain

I would add that the major problems for artists in the digital age stem from the explosion of new supply of content. This drives prices down and the search costs of discovery up. The result is that artists can’t find their audiences and consumers can’t find the content they desire. For poets, the point is not necessarily to sell poetry; more important is to find readers and appreciators of their work.

Large, centralized networks run on algorithms can’t solve this problem without commoditizing content and delivering the most popular, but often mundane, content that those metrics churn out.

We need to empower the human by connecting the creative.

OSN Heart

No, Nobody is Saving the Music Industry. Or the Culture Industry.

No, Streaming Services Are Not ‘Saving The Music Industry’

Recent reports that streaming is now the ‘biggest money-maker’ for the music biz have prompted hyperbolic claims that Spotify and co have ‘saved the music industry’. In reality, this could not be further from the truth.

Excerpt:

Progressive music that goes against the aesthetics of whatever the mainstream might be at any given point by its very nature does not cater to the whims of a Spotify algorithm. Now that streaming is the industry’s biggest money-maker it has become the overriding force in music consumption. This dominance will only increase as time goes on, and for artists to gain anything, as a result, requires them to conform or die. There are exceptions, most notably in zeitgeist-seizing movements like grime that are both artistically essential and buoyed by the kind of mass appeal that in effect bypasses the need for a leg-up from the algorithms, but such a lethal combination is rare indeed. Not everything that is great is as popular.

Yes, not everything that is great is popular, and not everything that is popular is great! We need human subjectivity, and that’s more complicated than even the most complex algorithm.

If streaming platforms keep growing more and more influence over how music is curated and marketed by those in charge, while the revenue for those not mundane enough to fit their algorithms remains so pitifully minute, it is not that impossible to envisage the blandest landscape the industry has ever seen. Great music will continue being made, of course, but getting that music out to people outside of the algorithms will be so much harder. “I hope I am wrong,” says Reeder. “I hope the revenue from streaming does improve, because if it doesn’t, well, who knows how positive the future will be for the majority of music makers and labels out there?”

This is not only the case for music, but for literature, poetry, video, and photography.

Woe is Facebook.

Some more bad news for Facebook. The following two news bites explain why. At a deeper level, we must ask why Facebook users are disengaging. I suspect it’s due to two factors: one, FB’s expressed desire to please users by emphasizing friends and family in timeline feeds; and two, the fact that most of FB’s timeline content outside of friends and family is fairly tedious and distracting. Both factors undercut the ad-pushing revenue model that FB depends on. One could sense this a year or more ago, when ads grew like kudzu in our timeline feeds.

The solution is to de-emphasize the monetization of meaningless connections and focus on meaningful connections.  That can only come through peer-to-peer sharing of meaningful content. Stories is one way to engage our creativity. Many others exist. See tuka.

The One Graph That Shows (Again) That Facebook’s Epic Run Has Ended

Facebook had a terrible, horrible, no good, very bad day.

After the company’s quarterly earnings call with investors, FB’s stock price dropped ~20% in after-hours trading. Over $100B in value disappeared in an instant after FB announced disappointing revenue numbers and user growth.

Some context: that’s comparable to the entirety of General Motors, Ford and Target… combined.

Why did the stock tank? A perfect storm hit one of Facebook’s core features, the News Feed:

  • Less “viral” clickbait in the feed. Facebook has committed to optimizing for “time well-spent” in the app, not overall engagement. While this shift made for a better experience for users, FB can’t show users as many ads as before.
  • Less feed personalization. In the wake of the Cambridge Analytica scandal and recent GDPR regulations, Facebook revamped its data usage policies and privacy controls. These changes hurt FB’s ability to charge companies big bucks to specifically target ads.

But the most interesting change to the News Feed: the rollout of Stories, a well-received but not-well-monetized feature that could change how most people use Facebook and its products altogether.

Stories might actually break Facebook

Stories – tappable, full-screen photos and videos – are replacing news feeds everywhere. The format was originally pioneered by Snapchat. In fact, right after Snapchat launched Stories in 2013, Facebook tried to acquire the company for $3B.

Facebook worked around the failed acquisition by copying Stories inside Instagram (and then in Messenger… and then in Facebook itself).

It worked:


Instagram Stories took off, with 250M+ users engaging with the feature less than a year after its launch. That’s over 50% of Instagram users… and close to double the number of Snapchat daily active users.

Unfortunately, the success of Stories might shoot Facebook’s ad revenues in the foot. Companies are still figuring out how to build the ad units – engaging, vertical video ads that users won’t immediately tap past. Rest assured that #content creators everywhere are working to make Story ads profitable. [Blogger note: interesting that people still think content is only worth what it can shake from the money tree (i.e., commercial advertising).]

Anti-Culture by Algorithm

As we’ve mentioned elsewhere, rule by algorithm can be just as stringent as any rule by a dictator, perhaps even more so, as it is vague, faceless, and hard to define. And human actors will always respond to perverse incentives by gaming the algorithm. This article explains how Amazon’s Kindle algorithms can create perverse incentives that determine market outcomes for books and shape the products we get to see and consume.
Let’s just say, it ain’t literature!

BAD ROMANCE

To cash in on Kindle Unlimited, a cabal of authors gamed Amazon’s algorithm

A genre that mostly features shiny, shirtless men on its covers and sells ebooks for 99 cents a pop might seem unserious. But at stake are revenues sometimes amounting to a million dollars a year, with some authors easily netting six figures a month. The top authors can drop $50,000 on a single ad campaign that will keep them in the charts — and see a worthwhile return on that investment.

In other words, self-published romance is no joke.

Book stuffing is a term that encompasses a wide range of methods for taking advantage of the Kindle Unlimited revenue structure. In Kindle Unlimited, readers pay $9.99 a month to read as many books as they want that are available through the KU program. This includes both popular mainstream titles like the Harry Potter series and self-published romances put out by authors like Crescent and Hopkins. Authors are paid according to pages read, creating incentives to produce massively inflated and strangely structured books. The more pages Amazon thinks have been read, the more money an author receives.

“But if they’re only relying on algorithms, they’re going to find — like YouTube and like Facebook have been finding — they need human curators as well.”

Exactly.
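
The pages-read payout structure described in that excerpt is simple enough to sketch. In the toy example below, the per-page rate, page counts, and borrow counts are made-up assumptions (not Amazon’s actual figures); they only illustrate why inflating a book’s page count pays so much better than writing a normal one.

```python
# Illustrative sketch of the Kindle Unlimited pages-read payout described above.
# The per-page rate, page counts, and borrow counts are assumptions made up
# for this example, not Amazon's actual figures.

per_page_rate = 0.0045  # assumed payout per page counted as read ($)

def payout(pages_in_book: int, fraction_read: float, borrows: int) -> float:
    """Author payout = pages counted as read x per-page rate, summed over borrows."""
    return pages_in_book * fraction_read * per_page_rate * borrows

# A normal-length novel, mostly read by the people who borrow it.
normal = payout(pages_in_book=300, fraction_read=0.8, borrows=10_000)

# A "stuffed" book padded with bonus material so far more pages
# register as read per borrow.
stuffed = payout(pages_in_book=3_000, fraction_read=1.0, borrows=10_000)

print(f"Normal 300-page book:    ${normal:,.0f}")
print(f"Stuffed 3,000-page book: ${stuffed:,.0f}")
```

With identical borrow counts, the padded book earns an order of magnitude more, which is exactly the incentive the article says book stuffers are gaming.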

Read more…

Networks and Hierarchies

This is a review of British historian Niall Ferguson’s new book, The Square and the Tower: Networks, Hierarchies and the Struggle for Global Power. It’s interesting to take the long arc of history into account in this age of global communication networks, which might seem to herald the permanent dominance of networks over hierarchies. History cautions us otherwise.

Ferguson identifies two predominant ages of networks. The first began with the advent of the printing press in 1452 and fueled an explosion of networks across the world until around 1800; this was the Enlightenment period that helped transform economics, politics, and social relations.

The second age of networks is the one that consumes us today. It began around 1970 with microchip technology and continues to the present: the age of telecommunications, digital technology, and global networks. Ours is an age where it seems “everything is connected.”

Ferguson notes that, beginning with the invention of written language, all that has happened is that new technologies have facilitated our innate, ancient urge to network – in other words, to connect. This seems to affirm Aristotle’s observation that “man is a social animal,” as well as a large library of psychological behavioral studies over the past century. He also notes that most networks may reflect a power law distribution and be scale-free. In other words, large networks grow larger and become more valuable as they do so. This means the rich get richer and most social networks are profoundly inegalitarian. This implies that the Google-Amazon-Facebook-Apple (GAFA) oligarchy may be taking over the world, leaving the rest of us as powerless as feudal serfs.
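
Ferguson’s power-law point is the classic “preferential attachment” dynamic, and a toy simulation (my sketch, not anything from the book) shows how quickly rich-get-richer growth produces that inequality.

```python
# Toy preferential-attachment simulation: each new node links to one existing
# node chosen in proportion to its degree, so well-connected nodes keep winning.
import random
from collections import Counter

def preferential_attachment(n_nodes: int = 10_000, seed: int = 42) -> Counter:
    """Grow a network one node at a time; return a map of node -> degree."""
    random.seed(seed)
    endpoints = [0, 1]  # start with two connected nodes; every edge adds both endpoints
    for new_node in range(2, n_nodes):
        # Picking uniformly from the endpoint list is equivalent to picking a
        # node with probability proportional to its degree.
        target = random.choice(endpoints)
        endpoints.extend([new_node, target])
    return Counter(endpoints)

degrees = preferential_attachment()
print("Five most-connected nodes:", degrees.most_common(5))
print("Median degree:", sorted(degrees.values())[len(degrees) // 2])
```

A few early hubs end up with a large share of all the links while the typical node has only one or two, which is the profoundly inegalitarian pattern Ferguson describes.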

But there is a fatal weakness inherent in this futuristic scenario: complex networks create interdependent relationships that can lead to catastrophic cascades, such as the global financial crisis of 2008, or an explosion of “fake news” and misinformation spewed out by global gossip networks.

We are also seeing a gradual deconstruction of networks that compete with the power of nation-state sovereignty. This is reflected in the rise of nationalistic politics in democracies and authoritarian monopoly control over information in autocracies.

However, from the angle of hierarchical control, Ferguson notes that the administrative state, with its failures of democratic governance, “represents the last iteration of political hierarchy: a system that spews out rules, generates complexity, and undermines both prosperity and stability.”

These historical paths imply that the conflict between distributed networks and concentrated hierarchies is likely a natural tension in search of an uneasy equilibrium.

Ferguson notes that “if Facebook initially satisfied the human need to gossip, it was Twitter – founded in March 2006 – that satisfied the more specific need to exchange news, often (though not always) political.” But when I read Twitter feeds, I’m thinking Twitter may be more a tool for disruption than for constructive dialogue. In other words, we can use these networking technologies to tear things down, but not so much to build them back up again.

As a Twitter co-founder confesses:

‘I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,’ said Evan Williams, a co-founder of Twitter, in May 2017. ‘I was wrong about that.’

Rather, as Ferguson asserts, “The lesson of history is that trusting in networks to run the world is a recipe for anarchy: at best, power ends up in the hands of the Illuminati, but more likely it ends up in the hands of the Jacobins.”

Ferguson is quite pessimistic about today’s dominance of networks, with one slim ray of hope. As he writes,

“…how can an urbanized, technologically advanced society avoid disaster when its social consequences are profoundly inegalitarian?

“To put the question more simply: can a networked world have order? As we have seen, some say that it can. In the light of historical experience, I very much doubt it.”

That slim ray of hope? Blockchain technology!

A thought-provoking book.

Algorithmic Culture

Some excerpts from this article point out the effect machine algorithms have on shaping our information, our entertainment, and our culture.

The Creepy and Creeping Power of Social Media

By Ned Ryun | June 8th, 2018

While algorithms are necessary to serve up the content people want, social media companies failing to be transparent on this front are dangerous… Algorithm tweaking isn’t neutral and it has a massive “follow on” effect in the digital industry and political world, changing the kind of content that people see every day. So if the algorithm starts filtering [say, content] it puts a thumb on the scale, favoring one side over the other. With a small handful of controllers over the algorithms, it’s appropriate to ask: who controls the controllers?

….

We should acknowledge that rule by algorithm can be just as stringent as any rule by a dictator, perhaps even more so as it is vague, faceless, and hard to define. These algorithms decide what you see and don’t see in your timeline, subtly determining for you what is “worthy” of your attention. Facebook treats this algorithm like a black box, we’re never allowed to look inside and see what’s going on, we’ll only ever see the results on our news feeds. A world ruled by algorithms—just like the one it replaced controlled by network executives—closes off views, closes off debates, and further Balkanizes people. So, in fact, how can these social media and tech giants save democracy when in fact they’re becoming less democratic?

This is the same dynamic that filters and feeds our artistic content through the world-wide web. We can only consume the content we can find, and this is how it’s being found.

Put The Damn Phone Down and Do Something

This is a good interview with Ben Silbermann, founder of Pinterest, published on Medium.

Some excerpts:

We’re social creatures. We need to connect with other people.

Pinterest is actually…it’s really about you. It’s about your tastes, your aspirations, your plans. There are other people there. Our recommendations are all curated by other users. The objective is not to do that [seek Likes]. That’s why it’s different than social networks.

Sure, it’s fun to look at millions of ideas, but eventually, the real satisfaction and joy comes from giving it a shot. It might turn out great. It might turn out poorly. All of that is fine. We want to be the company that motivates you to put your phone down and to go try those things.

So, Pinterest is doing the right things to encourage engagement within the community. The next step of Web 3.0 is to distribute the network value they create back to the community of users. tuka will do that.

Surviving the Digital Economy

This will be a crucial issue for a free society going forward… giving away your data is like giving away your labor.

Want Our Personal Data? Pay for It

The posting, tagging and uploading that we do online may be fun, but it’s labor too, and we should be compensated for it

By Eric A. Posner and Glen Weyl

WSJ, April 20, 2018 11:19 a.m. ET

Congress has stepped up talk of new privacy regulations in the wake of the scandal involving Cambridge Analytica, which improperly gained access to the data of as many as 87 million Facebook users. Even Facebook chief executive Mark Zuckerberg testified that he thought new federal rules were “inevitable.” But to understand what regulation is appropriate, we need to understand the source of the problem: the absence of a real market in data, with true property rights for data creators. Once that market is in place, implementing privacy protections will be easy.

We often think of ourselves as consumers of Facebook, Google, Instagram and other internet services. In reality, we are also their suppliers—or more accurately, their workers. When we post and label photos on Facebook or Instagram, use Google maps while driving, chat in multiple languages on Skype or upload videos to YouTube, we are generating data about human behavior that the companies then feed into machine-learning programs.

These programs use our personal data to learn patterns that allow them to imitate human behavior and understanding. With that information, computers can recognize images, translate languages, help viewers choose among shows and offer the speediest route to the mall. Companies such as Facebook, Google, and Microsoft (where one of us works) sell these tools to other companies. They also use our data to match advertisers with consumers.

Defenders of the current system often say that we don’t give away our personal data for free. Rather, we’re paid in the form of the services that we receive. But this exchange is bad for users, bad for society and probably not ideal even for the tech companies. In a real market, consumers would have far more power over the exchange: Here’s my data. What are you willing to pay for it?

An internet user today probably would earn only a few hundred dollars a year if companies paid for data. But that amount could grow substantially in the coming years. If the economic reach of AI systems continues to expand—into drafting legal contracts, diagnosing diseases, performing surgery, making investments, driving trucks, managing businesses—they will need vast amounts of data to function.

And if these systems displace human jobs, people will have plenty of time to supply that data. Tech executives fearful that AI will cause mass unemployment have advocated a universal basic income funded by increased taxes. But the pressure for such policies would abate if users were simply compensated for their data.

The data currently compiled by Facebook and other companies is of pretty low quality. That’s why Facebook has an additional army of paid workers who are given dedicated tasks, such as labeling photos, to fill in the gaps left by users. If Facebook paid users for their work, it could offer pay tied to the value of the user’s contribution—offering more, for example, for useful translations of the latest Chinese slang into English than for yet another video labeled “cat.”

So why doesn’t Facebook already offer wages to users? For one, obviously, it would cost a lot to pay users for the data that the company currently gets for free. And then Google and others might start paying as well. Competition for users would improve the quality of data but eat away at the tech companies’ bottom line.

It’s also true that users simply aren’t thinking this way. But that can change. The basic idea is straightforward enough: When we supply our personal data to Facebook, Google or other companies, it is a form of labor, and we should be compensated for it. It may be enjoyable work, but it’s work just the same.

If companies reject this model of “data as labor,” market pressure could be used to persuade them. Rather than sign up directly with, say, Facebook, people would sign up with a data agent. (Such services, sometimes referred to as personal data exchanges or vaults, are already in development, with more than a dozen startups vying to fill this role.) The data agent would then offer Facebook access to its members and negotiate wages and terms of use on their behalf. Users would get to Facebook through the agent’s platform. If at any time Facebook refused reasonable wages, the data agent could coordinate a strike or a boycott. Unlike individual users, the data agent could employ lawyers to review terms and conditions and ensure that those terms are being upheld.

With multiple data agents competing for users’ business, no one could become an abusive monopolist. The agent’s sole purpose would be managing workers’ data in their interests—and if there were a problem, users could move their data to another service without having to give up on their social network.

Companies such as Apple and Amazon also could get into the act. Currently, their business models are very different from those of Facebook and Google. For the most part, their focus is on selling products and services, rather than offering them without charge. If Facebook and Google refuse to pay users for their data, these other companies are big and sophisticated enough to pay for data instead.

Would the “data as labor” model put the tech giants out of business? Hardly. Their vast profits already reflect their monopoly power. Their margins would certainly be tighter under this new regime, but the wider economy would likely grow through greater productivity and a fairer distribution of income. The big companies would take a smaller share of a larger pie, but their business model would be far more sustainable, politically and socially. More important, they would have to focus on the value that their core services bring to consumers, rather than on exploiting their monopoly in user data.

As for Congress, it could help by making it simpler for individuals to have clear property rights in their own data, rights that can’t be permanently signed away by accepting a company’s confusing terms and conditions. The European Union has already taken steps in this direction, and its new regulations—which require data to be easily portable—are a leading stimulus for the rise of data agent startups. Government can also help by updating labor law to be more consistent with modern data work while protecting data workers from exploitation.

Most of us already take great satisfaction in using social media to connect with our friends and family. Imagine how much happier and prouder we would be if we received fair pay for the valuable work we perform in doing that.

Prof. Posner teaches at the University of Chicago Law School. Dr. Weyl teaches at Yale University and is a principal researcher at Microsoft (whose views he in no way represents here). Their new book is “Radical Markets: Uprooting Capitalism and Democracy for a Just Society,” which will be published on May 8 by Princeton University Press.

https://www.wsj.com/articles/want-our-personal-data-pay-for-it-1524237577 (Paywall)