Sunday, 29 April 2018 - 2:15pm
This week, I have been mostly reading:
- Video Game “Loot Boxes” Are Like Gambling for Kids — and Lawmakers Are Circling — Zaid Jilani at The Intercept:
In mid-November, video game publisher Electronic Arts released “Star Wars Battlefront II,” a multiplayer shooter for consoles and PCs. The title is likely to be a top item on many holiday shoppers’ lists; the original “Battlefront” sold an estimated 12 million copies. But “Battlefront II,” rated for ages 13 and up, has ignited a firestorm of controversy for the particularly cynical way it pushes players to buy “loot boxes,” random collections of in-game abilities that remain a mystery until purchased. Experts say loot boxes prey on addictive impulses that can be particularly difficult for children and other young people to control. Lawmakers, meanwhile, are considering regulating loot boxes as a form of gambling.
- Bizarro! — by Dan Piraro:
- The Illusionist — David Bentley Hart in The New Atlantis reads Daniel Dennett's latest, so that we don't have to:
The idea that the mind is software is a fairly popular delusion at the moment, but that hardly excuses a putatively serious philosopher for perpetuating it — though admittedly Dennett does so in a distinctive way. Usually, when confronted by the computational model of mind, it is enough to point out that what minds do is precisely everything that computers do not do, and therein lies much of a computer’s usefulness. Really, it would be no less apt to describe the mind as a kind of abacus. In the physical functions of a computer, there is neither a semantics nor a syntax of meaning. There is nothing resembling thought at all. There is no intentionality, or anything remotely analogous to intentionality or even to the illusion of intentionality. There is a binary system of notation that subserves a considerable number of intrinsically mindless functions. And, when computers are in operation, they are guided by the mental intentions of their programmers and users, and they provide an instrumentality by which one intending mind can transcribe meanings into traces, and another can translate those traces into meaning again. But the same is true of books when they are “in operation.” And this is why I spoke above of a “Narcissan fallacy”: computers are such wonderfully complicated and versatile abacuses that our own intentional activity, when reflected in their functions, seems at times to take on the haunting appearance of another autonomous rational intellect, just there on the other side of the screen. It is a bewitching illusion, but an illusion all the same. And this would usually suffice as an objection to any given computational model of mind. But, curiously enough, in Dennett’s case it does not, because to a very large degree he would freely grant that computers only appear to be conscious agents. The perversity of his argument, notoriously, is that he believes the same to be true of us.
- Poverty eliminated — This Modern World by Tom Tomorrow: