Apple is unexceptional. It’s just another Big Tech monopolist. Rounded corners don’t preserve virtue any better than square ones. Any company that is freed from constraints – of competition, regulation and interoperability – will always enshittify. Apple – being unexceptional – is no exception.
What I wish would happen — and yes, I know this is naive and stupid and probably fruitless — is that Apple would just give the slightest bit of ground. Yes, the company has the right to earn a profit from its IP, and yes, it created the market that developers want to take advantage of, and yes, the new generation of creators experimenting with new kinds of monetization only make sense in an iPhone world, but must Apple claim it all?
Let developers own their apps, including telling users about their websites, and let creatives build relationships with their fans instead of intermediating everything. And, for what it’s worth, continue controlling games: I do think the App Store is a safer model, particularly for kids, and the fact of the matter is that consoles have the same rules. The entire economy, though, is more than a game, and the real position of strength is unlocking the full potential of the iPhone ecosystem, instead of demanding every dime, deadweight loss be damned.
As everyone with any urge to read this far likely knows, the 1980s were a very important time in the history of computing. IBM’s PC was released in 1981, setting the standard for personal computing for decades to come. The Apple Lisa in 1983 presaged the Mac and the whole revolution of the windows-icons-mouse graphical user interface that would come to dominate computing.
Acorn saw these developments happening and realized they would need something more powerful than the aging but reliable 6502 to power their future machines if they wanted to compete. Acorn had been experimenting with a lot of 16-bit CPUs: the 65816 (the 16-bit variant of the 6502), the Motorola 68000 that powered the Apple Macintosh, and the comparatively rare National Semiconductor 32016.
None of these were really doing the job, though, and Acorn reached out to Intel to see about incorporating the Intel 80286 CPU into their new architecture.
But whether we trust Apple might be beside the point, if we don’t yet know whether we can trust ourselves. It took eight years from the launch of the iPhone for screen time controls to follow. What will human interaction look like eight years after smartglasses become ubiquitous? Our cyborg present sneaked up on us as our phones became glued to our hands. Are we going to sleepwalk into our cyborg future in the same way?
This lawsuit is also a reminder that Apple has a lot to lose. While the most likely outcome is an Apple victory — the Supreme Court has been pretty consistent in holding that companies do not have a “duty to deal” — every decision the company makes that favors only itself, and not society generally, is an invitation to examine just how important the iPhone is to, well, everything.
Indeed, this is the most frustrating aspect of this debate: Apple consistently acts like a company peeved it is not getting its fair share, somehow ignoring the fact it is worth nearly $2 trillion precisely because the iPhone matters more than anything. This is not a console you play on to entertain yourself, or even a PC for work: it is the foundation of modern life, which makes it all the more disappointing that Apple seems to care more about its short-term bottom line than it does about the users and developers that used to share in its integration upside; if Apple doesn’t change course, hyperessential will at some point trump hypercompetitive.
Fortnite is free, but users can pay for in-game items like weapons and skins through its direct payment option.
Epic said the system was the same payment system it already uses to process payments on PC and Mac computers and Android phones.
Apple takes a cut of between 15 and 30 per cent for most app subscriptions and payments made inside apps.
Thompson breaks down the App Store’s integration into three aspects: app installation, payment processing, and customer management. Although this provides many benefits to users in terms of trust and efficiency, it also creates problems for cross-platform developers, who have to make various adjustments to accommodate Apple.
I have long believed that the Internet is going to fundamentally remake all aspects of society, including the economy, and that one area of immense promise is small-scale entrepreneurship. The App Store was, at least at the beginning, a wonderful example of this promise; as Jobs noted, even the smallest developer could reach every iPhone on earth. Unfortunately, without even a whiff of competition, the App Store has now become a burden for most small developers, who instead of relying on the end-to-end functionality offered by, say, Stripe, have to support at least two payment solutions, the combined functionality of which is limited to the lowest common denominator, i.e. the App Store.
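Thompson’s “lowest common denominator” point is easy to make concrete. Below is a minimal, hypothetical Swift sketch (the types are stand-ins, not real StoreKit or Stripe APIs) of the abstraction a small developer ends up writing once they have to support both in-app purchase and a web checkout: the shared interface can only expose what both paths have in common, and anything provider-specific, such as refunds, proration, or coupon codes, either leaks around it or gets dropped.

```swift
/// Hypothetical common interface, so the rest of the app doesn't care which
/// payment stack handled a purchase. Only features BOTH providers share can
/// live here, which is exactly the lowest-common-denominator problem.
protocol PaymentProvider {
    var name: String { get }
    /// Returns true on success. Refunds, proration, coupons and the like are
    /// omitted because they don't work the same way (or at all) on both sides.
    func purchase(productID: String) -> Bool
}

/// Stand-in for the in-app purchase path (real code would call StoreKit here).
struct AppStorePayments: PaymentProvider {
    let name = "App Store"
    func purchase(productID: String) -> Bool {
        print("[\(name)] buying \(productID) via in-app purchase")
        return true
    }
}

/// Stand-in for a web checkout path (real code would talk to the developer's
/// own Stripe-backed server here).
struct WebCheckout: PaymentProvider {
    let name = "Web checkout"
    func purchase(productID: String) -> Bool {
        print("[\(name)] buying \(productID) via hosted checkout")
        return true
    }
}

// The app picks whichever path the platform rules allow for a given purchase.
let providers: [PaymentProvider] = [AppStorePayments(), WebCheckout()]
for provider in providers {
    _ = provider.purchase(productID: "pro.upgrade.yearly")
}
```

It’s a toy, but it captures why “just add a second payment system” isn’t free for a two-person shop.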
Continuing down this path, Thompson argues that at some point hyperessential will trump hypercompetitive. He elaborates on some of his suggestions in a follow-up piece.
Alex Hern also discussed this topic in his newsletter:
Time and again over the last six months, Apple has revealed that it truly believes that it is entitled to a cut of all commerce that occurs on an iPhone. It has said as much to developers, as it rejects their apps while noting that they made a lot of money without paying anything to Apple.
I don’t think that Apple is entitled to that. I don’t think that Apple is entitled to anything other than the money – the vast amount of money – that I have paid it to buy an iPhone in the first place. If it wants to make more money after that, it can try and sell me more products. But if I want to ignore it, I’m going to.
Only four people at Apple knew about this secret project. Me, the director of iPod Software, the vice president of the iPod Division, and the senior vice president of Hardware. None of us still work at Apple. There was no paper trail. All communication was in person.
If you asked Apple about the custom iPod project and got past the stock “No comment,” the PR people would tell you honestly that Apple has no record of any such project.
But now you know.
Anyone can learn to code on iPad or Mac with these 10 activities designed for beginners ages 10 and up.
If you want to live in the creative universe where anyone with a cool idea can make it and give it to you to run on your hardware, the iPad isn’t for you.
If you want to live in the fair world where you get to keep (or give away) the stuff you buy, the iPad isn’t for you.
If you want to write code for a platform where the only thing that determines whether you’re going to succeed with it is whether your audience loves it, the iPad isn’t for you.
Buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.
Along with John Gruber’s post, these pieces provide a useful provocation to reflect upon.
The iPad at 10 is, to me, a grave disappointment. Not because it’s “bad”, because it’s not bad — it’s great even — but because great though it is in so many ways, overall it has fallen so far short of the grand potential it showed on day one. To reach that potential, Apple needs to recognize they have made profound conceptual mistakes in the iPad user interface, mistakes that need to be scrapped and replaced, not polished and refined. I worry that iPadOS 13 suggests the opposite — that Apple is steering the iPad full speed ahead down a blind alley.
When Apple revealed the iPhone in late 2007, with its touch screen and several native apps, some technology writers dubbed it the “Jesus Phone”. It has sold more than 1.5bn units.
This makes the iPod the musical and technological equivalent of John the Baptist. The device was quickly superseded, but it prepared the way for the great innovations to come. It showed consumers that technology could be beautiful and that the most enthralling possessions could fit in the palm of a hand.
The company has made it extremely difficult to use web-based technology on its platforms, and it hopes developers won’t bother.
The evolution of the Apple keynote is understandable. Apple is a global company that changed computing by putting little ones in all our pockets. Their new phones are big deals by virtue of the fact that they’ve sold more than 2.2 billion iOS devices since their debut in 2007. iPhones changed how we communicate with one another and seek information; they’ve addicted us, tethering us to our jobs and helping us feel both attached to and alienated from one another. So it makes sense that we pay attention when the company dreams up a new iteration. Plus, they’re exceedingly shiny and the cameras can turn any point-and-click amateur taking photos of their goofy dog (me!) into Annie Leibovitz.
Apple claims that AirPods are building a “wireless future.” Many people think they’re a symbol of disposable wealth. The truth is bleaker.
Eighteen years after its launch, iTunes is going the way of the 8-track—but we’ll never forget the joy of compiling our first digital music libraries.
Really, in a lot of ways, the iPad Mini feels like the one true iPad, and the others are all just blown-up siblings that don’t quite know how to take advantage of their larger displays.
How IBM bet big on the microkernel being the next big thing in operating systems back in the ’90s—and spent billions with little to show for it.
Traditional maps are half shapes, half labels—but satellite and AR maps drop the shapes, and keep just the labels. And this spells trouble for Apple… Remember what we saw earlier: Apple is making lots of shapes out of its imagery. But Apple doesn’t appear to be making labels out of its imagery. Nor does Apple appear to be making labels out of its shapes.
📓 Seams vs. Stitches
Seamlessness isn’t pretty; it’s opaque and obscures the underlying structures of the tool you are making.
A stitch or a seam isn’t ugly; it’s an affordance that exposes the design, construction, and make of what you’ve made in a way that lends itself to learning.
Beauty and uniformity are two entirely independent characteristics. Seamlessness can look ugly and stitches can be pretty.
Good design can only be seamless when it has just one job to do. Add more jobs and seamlessness becomes a hindrance.