So, aside from the battery issue -- we achieved a 1% charge after sitting for about 48 hours untouched, hooray -- the MacBook Pro seems to be perfectly okay. I still feel like I'm smashing all the keys Shrek-style, and persistently reach for a delete-things-to-the-right button (and Home/End/PgUp/PgDn...) that isn't there, but that'll go away eventually. The Touch Bar is pretty neat, although wholly unnecessary, and my brain keeps telling me that the virtual Esc key is a colossal single point of failure waiting to take out the entire keyboard. How am I supposed to communicate my displeasure to the computer if I can't bang on Esc/Break/three-finger salute it when it hangs? Oh well.
Step One when I get a new computer is always to factory reset the OS (or install another one, if it's completely hosed) and then wade through like twelve hours of updates and rebooting. Macs have a hardware (firmware?) restore mode, and will happily nuke the boot drive for you if you're sure you want to start over. It's a pleasant change from all the Windows computers that get converted to Linux because the restore partition is hosed and I don't have any system discs, assuming the thing came with physical media in the first place. I have never actually paid for a standalone bootable install of Windows, and I'm not about to start.
Step Two is to use whatever its default browser is to download Chrome. Sorry, Safari/Edge/Firefox/Silk, but I kind of hate you, and you never talk to your siblings anyway.
I've long since switched over to using open-source software for all my office and media stuff. The sensible reason for this is that I have a huge stack of random devices, all running slightly different OSes. The current accumulation is an Acer Aspire running Windows 10, the MacBook that just updated itself to Monterey, a stack of older things that run Linux (Ubuntu 20.04 LTS, except for an ancient Chromebook that's been converted to antiX, a stripped-down distro based on Debian), an Android phone that runs Lollipop, and a Kindle Fire that might have gotten up to Nougat. And I've probably forgotten something. They all live in different places and get used for different things. The various big open-source projects all ship a build for each of Windows, Mac, and Linux, and they've taken quite a bit of trouble to make sure that the UI is pretty much identical on all of them.
Chrome is especially bae because all of the Chrome/Chromium installs talk to all of the other ones, and keep a shared History in the cloud. If I need to switch gadgets for some reason, I can just throw my current tabs at whatever I'm about to lug around and not lose track of everything I'm doing. I'm sure other browsers have sync features now, but they're nowhere near as consistent across platforms, particularly when switching to and from mobile. Chrome is also BFFs with the Chromecast for obvious reasons, and Chrome Remote Desktop is surprisingly useful if you're me and too lazy to get up and physically poke the trackpad on the computer running the TV/external monitor.
The petty reason for converting to OpenOffice et al. is that I was working in the university computer labs when Microsoft overhauled the UI on Word. I spent four solid months telling people, "Okay, click on the menu in the upper left. No, the menu. The menu. The big decorative button thing. No, I promise you, that's a menu. I agree, that is stupid." I will never forgive Microsoft, not just for insisting on fixing a thing that wasn't particularly broken, but for fixing it so badly that they would have received millions of strongly-worded letters of complaint about it, had anyone been able to figure out how to fucking print them.
Switching between operating systems is nowhere near as onerous as it once was. The differences between Windows 10 and Monterey, at the UX level, are minuscule compared to the gulf between, say, MS-DOS 6.22/Win 3.11 and those little toaster Mac 128Ks. If you dig around under the hood enough, the difference between the new Mac and my Ubuntu computers is even smaller -- OS X and Linux distros are all ultimately descended from the original Bell Labs UNIX in one way or another, as opposed to the Windows environment that grew upwards from various 8-bit home microcomputers.
The operating metaphors have more or less standardized across platforms. Everybody gives you a "desktop" workspace, on which actionable items are represented as "icons". Discrete units of information are represented as "files", which are organized into nested "file folders". There's a strip of desktop reserved for a customizable line of icons representing the things you do the most. Things you do less often are accessible through a more elaborate, searchable menu system. Stuff in general is located by following a decision tree from most general category to most specific until you reach a pointer unambiguously describing the item you want. You select things using a graphical cursor controlled by moving a pointing device around a representation of the desktop space that responds to a "primary click" and a "secondary click". Windows and Linux still use the legacy designations of "left/right click", a remnant of the two-button mouse, but pretty much everyone has decided that touchpads really want "one finger/two finger taps" and touchscreens want "tap/hold". We've almost but not quite settled on what happens when you make pinchy gestures or use three fingers, and whether two-finger scroll moves the content or the window is at least selectable in the control panel.
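That general-to-specific decision tree is, concretely, a filesystem path -- every OS in the pile above exposes the same structure, whether you browse it through Finder, Explorer, or `ls`. A minimal sketch in Python (the folder names here are invented for illustration):

```python
from pathlib import Path
import tempfile

# Build a tiny general-to-specific tree and resolve one "pointer":
# a full path that unambiguously names a single item.
root = Path(tempfile.mkdtemp())
target = root / "Documents" / "Taxes" / "2021" / "return.pdf"
target.parent.mkdir(parents=True)   # create the nested "file folders"
target.write_text("placeholder")

# Each step down the tree narrows the category until exactly one file matches.
print(target.relative_to(root))     # Documents/Taxes/2021/return.pdf on Unix-likes
```

The same walk is what a file-picker dialog animates for you, one double-click at a time.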
Really, at the casual user level, questions are along the lines of "how do I get to that thing I just installed?" and "how do I change the desktop wallpaper?" You can rummage around in the innards of the OS by opening a terminal window, but the average user wouldn't know how and would probably respond with panic if you tried to explain it. I have no qualms about typing gibberish like 'sudo apt-get update' to make something behave, but I also remember how to write *.BAT files in DOS -- I'm not the average user, I'm the person the average user hands their computer to when it's broken.
I will note that macOS still puts a lot more effort into dazzling people who have no idea how computers work than Windows does. The wallpapers that change with the time of day are indeed really neat-o, but it's not a Mac exclusive. Windows has also been able to do that, and a variety of other scripted tricks, since they introduced Active Desktop. It's just turned on by default on the Mac, where in Windows you have to specifically go looking, and possibly download an external widget, to do it. There's also no reason you can't have a Retina display on a Windows computer. This MacBook is based on the Intel Core i7, which runs 64-bit Windows perfectly well, and will correctly report and pull current Windows drivers for the fancy screen. The Touch Bar is literally just a strip of the same kind of capacitive touchscreen that you have on your phone, which is run by the OS and applications the same way games run the resistive touchscreen on your Nintendo DS/3DS.
Even the icons are more artistic than utilitarian. The Settings gear has a load of much finer teeth than you see on the generic icon everyone else uses, presumably to show off the sharp details on the screen. The Preview icon is a photo print with a slim white border and a small loupe magnifier. It's recognizable to anyone who has ever hand-retouched a photograph, but meaningless and puzzling to anyone else. (Judging from the Reddit questions, most people think it looks like a salt shaker.) The System Information icon is a set of calipers measuring a DIP chip, where the top of the calipers intentionally breaks out of the bounds of the icon box. Which is rather interesting, since the Finder icon does not break out, even though it's a throwback to the traditional Happy Mac face, whose nose-line customarily juts off the top and bottom of the rectangle.
The upshot of this is that a lot of what you pay for when you shell out for a Mac is the Apple design language. If that amount of pretty is worth the money to you, then go for it. The reason I don't care much for Apple products is not the design -- which isn't something I'm dying to have, but isn't objectionable either -- it's that they make it as difficult as possible to repair anything, thus forcing you to pay either Apple prices for repair or Apple prices for replacements. I see no problem with upcharging for UX look and feel, if that's something your users value, but I'm decidedly not cool with trying to monopolize repair and replacement so you can extort whatever amount of money you want from people who have already paid outright for their hardware. They would still get plenty of business from people who value convenience and had the cash to buy a Mac in the first place. This just screws over people like students who were given a MacBook for school and are stuck between a rock and a hard place if it breaks.
Out-of-warranty battery replacement on a MacBook is $199. This is unrelated to labor; it's Apple's price across the board for any battery replacement on anything that has run out of Apple Care. A corporate tech who does the battery replacements himself for insurance reasons has informed me that $199 is also the price of just the battery if you buy OEM from Apple, presumably to punish your apostasy. A third-party battery, which may or may not be recognized depending on how fussy your MacBook is, runs $80-100.
MacBooks under five or six years old are closed with pentalobe security screws. You can't patent something that simple, so it is possible to get screwdrivers in that shape, but they are an Apple-specific thing, so you won't find them outside of a kit meant for Mac repairs. The beginner iFixit kit is $25 or so.
The batteries are glued in, which is an unmitigated dick move by Apple. You do not need to glue batteries in. Nobody glues batteries into a fucking laptop. Nintendo doesn't even glue batteries into its portable consoles, and Nintendo is positively full of ideas for locking people out of their stuff. It's not technically expensive to unglue it -- apparently you just use isopropyl alcohol -- but dear God is it a pain. You ever tried to peel an old sticker off of something? It's that, only the sticker costs $100-200, is stuck to something that would cost $1200 to replace, and might catch on fire if you bend it too hard.
One of the reasons people who live in poverty end up buying low-cost, potentially low-quality alternatives is that it's no use owning a higher-quality thing that can be repaired if the repair costs more than a cheap replacement. It's another branch of Vimes' theory of economic unfairness. Nice leather boots can last a very long time, if you keep up with basic maintenance like new soles. Which you can't, if the re-sole costs more than just buying another pair made of plastic and cardboard. (It does. I've tried.) You can drive a Volvo to the moon and back, if someone in the family knows how to fix a Volvo; otherwise you just buy a gently-used Hyundai Accent and pray it doesn't fall to pieces on the freeway. I've got a lot of high-end clothing from Goodwill, but if I didn't know how to tailor and repair it myself I'd be buying crap at Forever 21 like everyone else.