> I still don’t like macOS and would prefer to run Linux on this laptop. But Asahi Linux still needs some work before it’s usable for me (I need external display output, and M4 support). This doesn’t bother me too much, though, as I don’t use this computer for serious work.
“I don’t use this computer for serious work.” Dropped $3K on MBP to play around with. Definitely should have gotten MBA
I don't think you can say that -- I paid about that for my 2021 M1 Max with 64GB and I'm still using it four years later as my main machine. There's an argument to be made to buy an expensive computer every 5 years or so rather than a cheaper one that you need to replace every 2 years because it's become unbearably slow.
Same here: I paid about twice as much for my 2013 Mac Pro that I’ll probably keep using until I replace it with an M5 Mac Studio at some point next year, which I’ll then plan to use for at least 5 years.
As for camera lenses, I expect my collection of manual focus F-mount Zeiss primes to have a longer useful life than their owner.
Same here; I bought an M2 Max with 96GB of RAM almost 3 years ago, for €4K, but a client paid half of it for a 1-year retainer. This machine is still the best thing I've worked with, and I have zero intention of switching anytime soon (I'll probably need to replace its battery in the future). I'd rather keep the same machine for 5 or 6 years than buy a crappier one every 2 years.
My laptop is still a 2012 MBP. Granted, I don’t use a laptop as my main computer; I use a hackintosh desktop. I might finally buy a new laptop in 2026; 14 years is not bad. If my new laptop can last that long I see no problem maxing out the specs at time of purchase.
What does the purchase price have to do with it? How long someone runs something seems like it would depend entirely on circumstances and constraints rather than cost.
I reckon it makes some sense for Apple users. You have to be willing (and financially able) to upgrade when Apple says. Apple forcefully obsoletes their products way too quickly to be a viable option if you care about longevity[0]. I have five excellent-condition still-perfectly-working Apple products next to me, none of which have current operating system support from Apple.
[0] EDIT: for reference, my previous ThinkPad lasted me 14 years.
14 years is indeed very long. Let’s instead assume 12: it’s 2013 and you got a top-specced T440 with a 4th-gen i7. That’s actually not bad, and the build quality is tank-like, as with all ThinkPads. Nothing I would use as a daily driver myself, but having used many other ThinkPads of that generation I can see why others are still getting by with one today.
Since we are talking about OS support: 4th-gen Intel isn’t supported by Windows 11, so you’d have to switch to Linux.
That's not particularly rational given how quickly computers progress in both performance and cost; a current-gen $1k MacBook Air will run circles around your M1. You'd probably be much better off spending the same amount of money on cheaper machines with a more frequent upgrade cadence. And you can always sell your old ones on eBay or something.
There are other factors to consider such as screen size, storage and RAM, connectivity and ports, active versus passive cooling (thermal throttling), and speaker quality. Additionally, the M1 Pro GPU benchmarks still outperform the latest M4 Air.
For example if I spec out a 13" M4 MBA to match my current 14" M1 Pro MBP, which with tax came to ~$3k in 2021 (32GB RAM, 1TB storage), that $1k MBA ends up being ~$1900. Now that more frequent upgrade cadence doesn't make as much sense financially. After one purchase and one upgrade, you've exceeded the cost of the M1 Pro MBP purchase.
Overall I don't disagree with your sentiment, especially for more casual use cases, but progress will never stop. There will always be a newer laptop with better specs coming out. I personally would rather beef up a machine and then drive it until it dies or can no longer perform the tasks I need it to.
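To make the cadence math concrete, here's a tiny sketch of that comparison. Every price and the resale value are made-up placeholders, not real quotes:

```ts
// Minimal sketch: cumulative spend over ~6 years for two purchase strategies.
// All prices and resale values are hypothetical round numbers.

// Strategy A: one maxed-out machine kept the whole period.
const maxedOnce = 3000;

// Strategy B: a ~$1900 mid-spec machine every 3 years, selling the old one.
const midSpec = 1900;
const resaleAfter3y = 700; // assumed second-hand value
const upgradeCadence = midSpec + (midSpec - resaleAfter3y); // first buy + one net upgrade

console.log(`Maxed once:      $${maxedOnce}`);
console.log(`Upgrade cadence: $${upgradeCadence}`); // $3100 under these assumptions
```

Under those invented numbers the two strategies land within about $100 of each other, which is really the point: the outcome hinges on resale value and how long each machine actually stays useful, not on the sticker price alone.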
I like using computers until they break on me; I've never really felt (for the usage I give my MacBook) that it is lacking in power. Even after, what, 5 years?
I think I'll be upgrading in the next 2 or maybe 3 years if Apple puts OLED screens on their new machines, as is rumored.
Respectfully, this is also bullshit for my use case. For me, the M1 purchase was a step up compared to Intel; the rest is diminishing returns for now.
It’s also not true if you care about certain workloads like LLM performance. My biggest concern for example is memory size and bandwidth, and older chips compare quite favorably to new chips where “GPU VRAM size” now differentiates the premium market and becomes a further upsell, making it less cost-effective. :( I can justify $3k for “run a small LLM on my laptop for my job as ML researcher,” but I still can’t justify $10k for “run a larger model on my Mac Studio”
M2 here also, still flies for cross platform mobile development. The 250GB storage space is a bit tight without external storage but my dev environment is lean and purges caches every day so I manage easily.
Just off the top of my head, hobbies I've been in or around where this $3k would be a nothing-burger: photography, woodworking, being a grease monkey, cycling, gun collecting, antiquing, recreational substances...
Fiberglass sailboats last forever and the hobby is dying as people age out of it. I’m in the sailing community and get offered nice free boats in usable condition every year, but I already have 2 so I refuse any more. This year alone I’ve turned down both a 40ft and a 23ft free boat from 80-90 year old friends who aged out. Parts are expensive, but if you can do repairs yourself, you can absolutely own a pretty nice sailboat for about what a new Apple laptop costs. I paid $1800 at auction for my most recent sailboat; it is only 7 years old and needed nothing. Did an overnight trip on it recently.
I want to find a way to revive the hobby by showing younger people short on money that they can get into sailing for less than they already spend on much less rewarding stuff like app subscriptions and smartphones.
Well, there’s hobbies and there’s a buying addiction that comes with a hobby.
In many areas there’s a tendency to overdo it with tools, gadgets and also to compensate for lack of skill with more gadgets. I do woodworking for example and my total spend for industrial vacuum, different types of power and hand tools, work bench, clamps, etc probably comes to around a few thousand EUR. Mine is a really good set-up for a hobby, but I still don’t have any stationary machines or fancy separate work area or room.
I bought everything over the years and I only buy brand-name. My point is, this is actually a lot of money especially if spent as lump sum and not at all a “nothing-burger”.
Knitting / crocheting / quilting / embroidering? Drawing / painting / calligraphy? Singing in a choir? Creative writing / journaling / blogging? Solving crossword puzzles? Bird watching? Day hikes? Reading? Visiting museums? Learning about history / philosophy / art / whatever? Learning a language? Taking dance classes? Playing chess or petanque or any other game that doesn’t require expensive gear? Or most sports?
A lot of things are cheap to taste — a second hand bike and some $200 running shoes and you’re training for a triathlon. Or a makerspace membership and you’re now sewing or doing 3d printing.
It’s once you get “serious” and need to have your own equipment that all these things get real. Or, in the case of things like social dance, you want to take time off and travel further and further away to attend pricey exchanges and camps.
It’s perfectly possible to enjoy hobbies deeply without getting “serious” in the way you describe.
I’ve taken my 10 euro dance classes for years without feeling the necessity of pricey exchanges and camps.
My neighbour goes to the park many evenings to play petanque, doesn’t cost him anything.
A couple I’m friends with goes on day hikes where they do bird watching—maybe they bought a nice pair of binoculars once? Another couple likes to lay jigsaw puzzles together, not exactly breaking the bank!
My sister is learning Finnish because she never learned a non-Indo-European language. She bought a book.
I would wager most people’s hobbies are low-key like this because either they don’t have disposable income to spend on them, or they don’t want to!
Absolutely yeah, and regardless of whether it ends up eventually being expensive, I think part of what I’m saying is that it is important to know how to at least start something cheaply.
I get very frustrated with the kind of people who see one tiktok about a thing and suddenly feel like they need to spend $3k to pursue whatever their new passion is.
Besides programming, my hobbies are writing stories, writing and recording songs, drawing, and painting. None of them needs to cost anywhere near $3000. Any of them can cost as much as you want.
Take the music hobby as an example. I have several expensive guitars now, but in the first 20 years of that hobby I probably spent under $1000 on guitars and related gear the entire time.
I mean if you ranked all the hobbies in terms of cost, casually spending $3k on a laptop would be near the top of the list. But there are a small number of hobbies that are vastly more expensive.
The distribution is highly skewed. Like wealth. The 99th percentile are near the top in rank (by definition) but nowhere near the top in absolute terms.
I cannot think of many hobbies which are less expensive if you are serious about them. Some hobbies around me where $3000 wouldn't get you far: motorcycles, cars, cycling, collecting anything, woodworking, machining, music making, traveling, horses, ...
The cycling industry works hard at making sure people think they need expensive bicycles, but you can perfectly well enjoy cycling as a hobby without spending a fortune on it.
And in contrast to computers, a bicycle from 40 years ago still does the same job as it did at the time; there is no software making it incompatible and it doesn't feel slower than the more modern stuff. All you need is a set of brake pads, cables, tires, a chain and a cassette every once in a while. All these consumables are fairly cheap if you aren't chasing the newest/highest-end tech and stick to 2x9 / 2x10 speed transmissions.
Some of those, like horses, are 1% hobbies. But many of the others can be done very affordably. Buying used equipment, learning from YouTube and online resources, starting small and scaling gradually make most of those hobbies accessible at a fraction of the cost.
I can think of dozens. Running, dance, knitting, painting, woodworking (you can go very far for much less than $3k), archery, chess, board games, drawing, brewing, darts, cycling, etc. etc.
Obviously you can spend pretty much any amount of money on those if you want (if you are "serious" about it) but you don't have to and most people don't. Also he said this $3k expenditure wasn't for serious work.
I interpreted it as: if you include all hobbies and games made by humans in history, I'm pretty sure most of them involve a set of cards made of paper, some others involving wooden figurines (chess, checkers) or even drawing on dirt with a stick.
A computer is many, many orders of magnitude more complex and expensive than that.
This isn't said with the intention to demonize expensive hobbies if no one is harmed because of it.
But I do sometimes wonder if my hobbies are too dependent on a power plug. Even reading, which I do with an e-reader.
It's sad that more countries outside of North America haven't actively developed their general aviation industries. It's never going to be cheap (or safe), but there's no good reason to impose the high taxes and regulatory constraints that keep it out of reach of regular upper-middle-class people in many countries.
It’s a doable, common hobby for middle-class Americans. I grew up in a rural area with a dirt airstrip and everyone owned planes, even people that could barely afford a reliable used car. You can sometimes find something like an old Cessna for about $20k, and if you’re willing to do “experimental” planes that you fix yourself, sometimes just a few $k. Like anything, if you’re an insider in the community you can get good deals, sometimes even free from friends that age out, etc.
Many universities in rural areas have student clubs that offer lessons and rent club owned planes for cheap.
> even people that could barely afford a reliable used car. You can sometimes find something like an old Cessna for about $20k,
Not sure what you call a "reliable used car". My 2006 Mercedes B200, low mileage for its age, cost me €5.5k, for instance. A car doesn't have to cost a lot to be reliable.
Around me $20k is an expensive price for a car, and most people buy second-hand, 20+ year old cars for less than €5k.
Worked hard, won a lottery, whatever. It mostly says that these are people with tens of thousands to burn on fun stuff, and such people are a rather narrow slice of the population. There's nothing bad about that, it's just a rather niche community, whose opinions may not be very relevant for the large majority of people outside that niche.
HN is that niche community, though. HN is a forum targeting a niche community that skews technical no matter where someone is physically from, and that community skews relatively rich. Concern trolling that there are starving kids in Africa rings a bit hollow when there are literal billionaires posting here; I mean sure, I'm not saying we shouldn't say something for fear of their feelings. Nor am I saying that everyone here must be rich in order to comment here. Just that some members of the niche community are, recognizably, inordinately rich. Advertising e.g. the Volonaut here will likely generate a couple of sales, and if you thought a $3k laptop was a lot, definitely don't look that one up.
This whole discussion is weird. For the majority of the world's population, dropping $3K on a computer is a non-starter, if even possible. Over six hundred million people cannot even afford proper food and shelter. But there are also sixty-two million millionaires in the world. So there are a large number of people who can buy a MBP without even blinking. We've just discovered income disparity. What the heck does that obvious truth have to add to a review of a MBP?
The "mac community" is even worse. I recently spend $4k on linux laptop, and I get endless criticism, that it is "too expensive" for a "windows pc". I need spec for my work, and comparable mac is 4x more expensive!
Computers are actually cheap as far as Swiss taxes go (I bought my first MacBook Pro when Intel came out, at EPFL). I’m sure they got their computer for about the same price as you could get it in Hong Kong. But ya, food, rent, and services are pricey in Switzerland, even if you are just grabbing a croissant at Coop.
Jokes aside, electronics is way cheaper here (also thanks to a relatively low VAT) than in most countries - although Apple keeps their prices pretty much the same across the world.
> I don’t notice going back to 60 Hz displays on computers. However, on phones, where a lot more animations are a key part of the user experience, I think 120 Hz displays are more interesting.
I'm always so jealous of these people; 60 Hz is just so bad for me now and even makes me a bit motion sick.
I can see it in everything, moving the window, scrolling, the cursor.
I've made a test for myself. Screen split into two parts, two small squares moving and bouncing. First square moves every frame, second square skips every second frame, but moves 2x. So basically one half of the screen is full FPS, another half of the screen is half FPS. And I implemented it as a "blind test", so I could make a guess and then check it.
For the 60 Hz screen, the difference between 30 FPS and 60 FPS was pretty obvious and I could guess it 100% of the time.
For the 144 Hz screen, the difference between 72 FPS and 144 FPS was not obvious at all and I couldn't reliably guess it. I also checked it with a few other people, and they all failed this simple test.
So now I hold the firm opinion that these high-FPS displays are a marketing gimmick.
https://pastebin.com/raw/hwR62Yhi: here's the HTML; save it and open it. Left click reveals which half is "fast" (full FPS) or "slow" (half FPS), scroll changes the speed, F5 generates a new test.
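For anyone who'd rather see the mechanics than open a pastebin link, here's a rough TypeScript sketch of the same idea. It is not the linked file and leaves out the blind-test reveal; it only shows the full-rate vs. half-rate squares:

```ts
// Two squares wrap across a canvas: the top one updates every frame, the
// bottom one updates every second frame but with twice the step, so both
// cover the same distance per second. Any visible difference is update rate.
const canvas = document.createElement("canvas");
canvas.width = 800;
canvas.height = 200;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

let frame = 0;
let xFull = 0;   // full-rate square position
let xHalf = 0;   // half-rate square position
const step = 4;  // pixels per frame at full rate

function draw() {
  frame++;
  xFull = (xFull + step) % (canvas.width - 40);
  if (frame % 2 === 0) xHalf = (xHalf + 2 * step) % (canvas.width - 40);

  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = "black";
  ctx.fillRect(xFull, 40, 40, 40);   // top: moves every frame
  ctx.fillRect(xHalf, 120, 40, 40);  // bottom: moves every other frame, 2x step
  requestAnimationFrame(draw);
}
requestAnimationFrame(draw);
```

Because both squares have the same average speed, the only thing being compared is temporal resolution, which is what makes the blind version of this test fair.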
For me it's the motion clarity that I notice the most. Higher FPS is just one way to get more clarity though, with other methods like black frame insertion then even 60 fps feels like 240.
Thanks for sharing the test. I'm surprised you aren't able to tell the difference -- I can pretty consistently (90%+) get the right answer to both sides at 120 fps "fast," speeds as low as ~500. At higher speeds it's much easier.
> So now I'm holding firm opinion, that these high-FPS displays are marketing gimmick.
While I agree the jump from 60 -> 140 Hz/fps is not as noticeable as 30 -> 60, calling everything above 60 a ”marketing gimmick” is silly. When my screen or TV falls back to 60 Hz for whatever reason I notice it immediately; you don’t have to do anything other than move your mouse or scroll down a webpage.
If I hook up an LED to a microcontroller and blink it at increasing frequencies until I stop being able to see the flicker (for me, about 85 Hz), then, if the brain's hardware is optimized, shouldn't I notice no difference at twice that frequency, à la the Nyquist sampling theorem?
Pretty cool test, but I wonder how fast you ran them at? I was able to distinguish between full and half after increasing the speed to around ~2000 units.
It's interesting how different people pick up different details. I can't really see the difference between 60Hz and 120Hz for example, but I'm unusually sensitive to bad kerning. The nano texture screen also screams smearing and low resolution to me.
How can you know it is not bias? For what it's worth, you might never have noticed any difference if you didn't know they weren't refreshing at the same frequency.
> I'd like to think that those who don't notice the difference have improved brain GPUs that can compensate for ghosting.
Wow. My perspective was those that did notice the difference were more perceptive. Thank you - now I realize there is a completely different take. (I'm not sure that it's helpful mind you... but it gives me something to chew on).
The curse of high standards. I wish I didn't notice a lot of things. I wish I could stop thinking about why something that is clearly better hasn't been done.
This is such a weird experience for me. On my phone, I instantly notice going back to 60 from 90 hz. But on my computers and handheld consoles, I don't mind, or even notice, at all.
The major difference is that in one case you're watching something without interacting with it, and in the other it's responding to your actions; in one your gaze is relatively still, taking in the entire frame, while in the other your eyes are tracking an object as you interact with it via some sort of input device.
In tracking motion your eyes/brain can see improved motion resolution (how clear the details are in an object moving across the screen) up to 1000Hz.
Personally I've had concussions and bad screens do make me sick. Even 60hz TVs if I'm sitting somewhat close, particularly for certain content. All the chaos of Dr. Strange / Multiverse was too much for me to watch.
> My ideal MacBook would probably be a MacBook Air, but with the nano-texture display! :)
I agree on the nano-texture display, having used one in person for a little bit. It's sort of like an ultra-fine matte texture that isn't noticeable while using it, but is noticeable compared to other devices in the same room. I hope it becomes a more standard option on future devices.
That said, I've used Thinkpads with matte displays and while not as fine, they mostly have the same benefit.
I think my ideal would be a MacBook Air with both the nano-texture and higher 120hz refresh rate the Pro has. With that, I'll trade an extra second of compile time for my rust projects for the smaller form factor.
To me it looked very much like the matte coating on Dell monitors, where bunched-up same-color pixels have this "feels like there's a rainbow here but if I focus on it I don't see it anymore" effect. Definitely better than ThinkPad matte, though.
My mom has an M1 Air, and its resolution is not great. Everything looks a bit blurry compared with my 4K Dell XPS or my wife’s M4 MacBook Pro display. I guess the Air’s native resolution means it has to do fractional scaling.
The m1 air native resolution is 2560x1600 and the 'best for display' default is 1280x800, that's 2x integer scaling. But yeah if you have a different resolution set, it'll be fractional and probably a bit blurry in comparison.
Yeah the default doesn't do a 1:1 display to pixel ratio.
Just to be pedantic, it is integer scaled (from 1440x900 to 2880x1800) but then resampled down to the MBA's native resolution of 2560x1600 via something better than bilinear.
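To put the arithmetic above in one place, here's a small illustrative sketch using the numbers from this thread (it is not an Apple API, and the exact resampling filter macOS uses isn't specified here):

```ts
// Illustrative arithmetic only: macOS renders HiDPI UI at 2x the selected
// "looks like" resolution, then resamples that buffer to the physical panel
// if the sizes don't match.
const panel = { w: 2560, h: 1600 }; // M1 Air native resolution

function scalingInfo(looksLike: { w: number; h: number }) {
  const backing = { w: looksLike.w * 2, h: looksLike.h * 2 }; // 2x render buffer
  const downsample = backing.w / panel.w; // 1.0 means buffer pixels map 1:1 to panel pixels
  return { backing, downsample, integerMapping: downsample === 1 };
}

console.log(scalingInfo({ w: 1280, h: 800 }));
// backing 2560x1600, downsample 1 -> crisp 2x integer scaling

console.log(scalingInfo({ w: 1440, h: 900 }));
// backing 2880x1800, downsample 1.125 -> resampled to 2560x1600, slightly soft
```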
One thing that wasn't mentioned is the max sustained screen brightness for SDR, which is higher on the M4 Pro (1000 nits) compared to the M4 Air or M1 Pro (500 nits).
There’s an awesome app called Vivid which just opens the HDR max brightness. I use it all the time with my M3 Pro when working outside and I believe it also works on earlier models.
There are so many base features that are inexplicably relegated to 3rd party apps. Like a better finder experience. Or keeping screen on. Or NTFS writing.
NTFS writing isn't that inexplicable. NTFS is a proprietary filesystem that isn't at all simple to implement and the ntfs-3g driver got there by reverse engineering. Apple doesn't want to enable something by default that could potentially corrupt the filesystem because Microsoft could be doing something unexpected and undocumented.
Meanwhile if you need widespread compatibility nearly everything supports exFAT and if you need a real filesystem then the Mac and Windows drivers for open source filesystems are less likely to corrupt your data.
I'll take ntfs-3g over the best implementation of exFAT in a heartbeat. Refusing to write to NTFS for reliability purposes, and thereby pushing people onto exFAT, is shooting yourself in the foot.
At which point you're asking why Apple doesn't have default support for something like ext4, which is a decent point.
That would both get you easier compatibility between Mac and Linux and solve the NTFS write issue without any more trouble than it's giving people now because then you'd just install the ext4 driver on the Windows machine instead of the NTFS driver on the Mac.
> There are so many base features that are inexplicably relegated to 3rd party apps.
> Like a better finder experience.
> Or keeping screen on.
Do you mind linking or naming which tools you use for those 2 purposes?
Asking out of pure curiosity, as for keeping the screen on, I just use `caffeinate -imdsu` in the terminal. Previously used Amphetamine, but I ended up having some minor issues with it, and I didn't need any of its advanced features (which could definitely be useful to some people, I admit, just not me). I just wanted to have a simple toggle for "keep the device and/or display from sleeping" mode, so I just switched to `caffeinate -imdsu` (which is built-in).
As for Finder, I didn't really feel the need for anything different, but I would gladly try out and potentially switch to something better, if you are willing to recommend your alternative.
I use the Finder and Raycast heavily. Raycast is not, and does not sell itself as, a Finder equivalent.
OP: I've tried all the Finder replacements. Path Finder, for example. At the end of the day, I went back to Finder. I always have a single window on screen with the tabs that I use all day. This helps enormously. I show it on YouTube here (direct timestamp link): https://youtu.be/BzJ8j0Q_Ed4?si=VVMD54EJ-XsxkYzm&t=338
Finder is the number one reason it boggles my mind that people claim macOS as head and shoulders above other OSes "for professionals". Finder is a badly designed child's toy that does nothing at all intuitively and, in fact, actively does things in the most backwards ways possible. It's like taking the worst of Explorer (from Windows XP), and smashing it into the worst of Dolphin or Nautilus; and, to top it off, then hiding any and all remaining useful functionality behind obscure hot keys.
It has been more or less the same as long as I've used it (20 years or so). Familiarity is a plus. It is a pretty simple and straightforward tool. I'm not sure what you might find perplexing about Finder.
Welcome to the Mac ecosystem, where basic functionality is gated behind apps that Apple fans will tell you "are lifesavers and totally needed in Windows/Linux/etc." for $4.99-14.99 apiece. And, when they get popular enough, Apple will implement that basic functionality in its OS and silently extinguish those apps.
And that's when they let you modify/use your OS the way you want.
As far as I understand, Windows only has a toggle for HDR on vs. off; that's not what we're talking about here. This is about forcing the full brightness of HDR all the time, even outside videos. It's something that manufacturers don't allow, as it reduces display life; it would actually be an anti-feature for a consumer OS to expose as a setting. It'd be like exposing a setting to let your CPU go well beyond normal heat limits.
I don't mind that. 3rd party Mac utilities are nice: well designed, explained and do what they're supposed to because someone makes a living of it. I'm happy to pay these prices.
I would personally be afraid of using that in case it causes damage long-term to the screen either due to temperature or power draw or something. Idk if there are significant hardware differences but in this case I would guess there’s a real hardware reason for it?
I imagine what those custom brightness apps do is not magically increase the brightness, but change the various pixels' brightness in accordance to some method/algorithm such that you see what appears to be brighter whites when placed next to certain other colors.
It's not what is implied by the parent post - where the mac is limiting the brightness only to have the app unlock it.
No, I believe the issue is Apple limits the top half or so of the brightness/backlight level for HDR content only. The apps allow it to be used for normal non-HDR content.
...I'd have to say that seems like a radical reading of the text.
No; you can adjust screen brightness just fine with the built-in settings, including with the F1 and F2 keys (plus the Fn key if you've got them set that way).
This Vivid app is specifically for extra HDR levels of brightness. I've never had a problem with my M1 or M4 MBPs, either inside or outside, with the built-in brightness levels. (But, to be fair, I don't use it outside a lot.)
It's classic Apple to spend over a decade insisting that glossy screens were the best option, and then to eventually roll out a matte screen as a "premium" feature with a bunch of marketing around it.
Historically, traditional matte screen finishes exhibited poor optical qualities by scattering ambient light, which tended to wash out colors. This scattering process also affected the light from individual pixels, causing it to refract into neighboring pixels.
This reduced overall image quality and caused pixel-fine details, such as small text, to appear smeary on high-density LCDs. In contrast, well-designed glossy displays provide a superior visual experience by minimizing internal refraction and reflecting ambient light at high angles, which reduces display pollution. Consequently, glossy screens often appear much brighter, blacks appear blacker without being washed out, colors show a higher dynamic range, and small details remain crisper. High-quality glass glossy displays are often easy to use even in full daylight, and reflections are manageable because they are full optical reflections with correct depth, allowing the user to focus on the screen content.
Apple's "nano texture" matte screens were engineered to solve the specific optical problems of traditional matte finishes, the washed-out colors and smeary details. But they cost more to make. The glossy option is still available, and still good.
I used to have a 2006 MacBook Pro with the matte screen. It was glorious. None of these issues were present or really noticeable. Maybe you'd notice them in a lab setting but not IRL. Kind of like 120 Hz and 4K: just useless to most people's eyes at the distances people actually use these devices. I've only owned matte external monitors as well and again, no issues there.
The glossy-era MacBooks, on the other hand, have been a disaster in comparison, IMO. Unless your room is pitch black it is so easy to get external reflections. Using it outside sucks; you often see yourself more clearly than the actual contents of the screen. A little piece of dust you flick off the screen becomes a fingerprint smear. The way the lid actually opens on the new thin-bezel models means the top edge is never free of fingerprints. I'm inside right now and this M3 Pro is on the max brightness setting just to make it, you know, usable, inside. I'm not sure if my screen is defectively dim or if this is just how it is. Outside it is just barely bright enough to make out the screen. Really not much better than my old 2012 non-Retina model in terms of outdoor viewing, which is a bit of a disappointment because the marketing material led me to believe these new MacBooks are extremely bright. I guess for HDR content maybe that is true, but not for 99% of use cases.
I can't go back to the low contrast and washed-out look of matte screens unfortunately. The nano texture isn't terrible but I'd only use it if I had to work with a bright window or other lighting source behind me. If you go to an Apple store you can A/B test glossy vs. nano-texture and glossy wins for me.
Yeah, what on earth. Go back to one of those old displays and I guarantee you'll want to gouge your eyes out at how terrible they are. 2006 should put you firmly in 720p land.
120Hz is absolutely a noticeable improvement over 60Hz. I have a 60Hz iPhone and a 120Hz iPhone and the 60Hz one is just annoying to use. Everything feels so choppy.
I believe refresh rate/FPS is one of those things where it doesn't really matter but human eyes get spoiled by the higher standard, making it hard to go back. I never saw issues with 30 FPS until going to 60, etc. Hopefully I never get a glimpse of 120 or 144Hz, which would require me to throw out all existing devices.
I'm not convinced. I have an iPhone 14 Pro, which has a 120 Hz screen. I can absolutely see the difference when scrolling compared to my older iPhone 11 or computer screens.
However, I'm typing this on my Dell monitor which only does 60 Hz. It honestly doesn't bother me at all. Sure, when I scroll long pages I see the difference: the text isn't legible. But, in practice, I never read moving text.
However, one thing on which I can't go back is resolution. A 32" 4K screen is the minimum for me. I was thinking about getting a wider screen, but they usually have less vertical resolution than my current one. A 14" MBP is much more comfortable when looking at text all day than my 14" HP with an FHD screen. And it's not just because the colors and contrast are better; it's because the text is sharper.
I can't tell at all when my mbp is in 120hz or 60hz. I tried to set up a good test too by scrolling really fast while plugging and unplugging the power adapter (which kicks it into high power 120hz or low power 60hz).
One of those things that some people notice, some people don't. I'm definitely in the camp where I feel differences between 120hz and 60hz, but I don't feel 60hz as choppy, and beyond 120hz I can't notice any difference, but others seemingly can. Maybe it's our biology?
Basically everyone who has played videogames on pc will notice the difference. I easily notice a drop from 360Hz to 240Hz.
I also use 60Hz screens just fine, saying that getting used to 120Hz ruins slower displays is being dramatic. You can readjust to 60Hz again within 5 minutes. But I can still instantly tell which is higher refresh rate, at least up to 360Hz.
We're talking about monitors here, which usually have a mouse cursor on it for input. Of course it would be hard to tell between 60 vs 120Hz screens if you used both to play a 30FPS video.
60 to 120? Generally there are telltale signs. If I quickly drag a window around it’s clear as day at 120.
Most people who’ve used both 60 and 120 could tell, definitely if a game is running. Unless you’re asking me to distinguish between like 110 and 120, but that’s like asking someone to distinguish between roughly 30 and 32.
North of 120 it gets trickier to notice no matter what IMO.
It's super easy: put your finger on a touchpad and move it fast in a circle so that the cursor also moves in a circle. As the eye is not that fast, you will see multiple faint mouse cursor images. At 120 Hz there will be twice as many cursors as at 60 Hz.
On a perfect display you should see just a faint grey circle.
Another test is moving the cursor fast across a white page and tracking it with your eyes. On a perfect display it should be perfectly crisp; on my display it blurs and moves in steps.
So basically on a perfect display you can track fast moving things, and when not tracking, they are blurred. On a bad display, things blur when tracking them, and you see several instances otherwise. For example, if you scroll a page with a black box up-down, on a bad display you would see several faint boxes overlayed, and on a perfect display one box with blurred edges.
You could replicate a "perfect display" by analytically implementing motion blurring (which is really just a kind of temporal anti-aliasing) in software. This wouldn't let you track moving objects across the screen without blur, but that's a very niche scenario anyway. Where 120 Hz really helps is in slashing total latency from user input to the screen. A 60 Hz screen adds up to 16.667 ms of latency, which is more than enough to be perceived by the user.
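For the latency point, the per-frame contribution is just the refresh interval. A quick sketch of the arithmetic (display interval only; it ignores the input device, OS, compositor, and panel response time):

```ts
// Worst-case wait added by the display alone: an input event that just missed
// a refresh has to wait a full interval before it can affect the screen.
function frameIntervalMs(hz: number): number {
  return 1000 / hz;
}

for (const hz of [60, 120, 144, 240, 360]) {
  console.log(`${hz} Hz -> up to ${frameIntervalMs(hz).toFixed(2)} ms added`);
}
// 60 Hz -> up to 16.67 ms, 120 Hz -> up to 8.33 ms, 360 Hz -> up to 2.78 ms
```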
I think it’s more noticeable if you are touch-interacting with your screen during a drag. If you are scrolling with a mouse, you might not notice it at all, unlike when you’re scrolling with your finger.
At the distance I look at my TV screen (about 7 feet from the couch) I can't make out the pixels of the 1080p screen. 4K is lost on me. 20/20 vision, but I guess that is not enough.
Used to have a 27" 2560x1440 monitor at home. Got a 4K 27" at work, and when I got home, the difference was big enough that I (eventually) decided to upgrade the home monitor.
I also have perfect vision in terms of focal length, but it turns out I have astigmatism on opposite axes in my two eyes.
Glasses make a huge difference when watching TV, and are the dividing line between being able to tell the difference between 4K and 1080p and not being able to discern any.
I have the last-gen 27” 5K iMac with nano texture as my primary monitor these days, and you can immediately tell the difference in image quality compared to a glossy MacBook Pro. Don’t get me wrong, it’s by far the best-quality matte finish I’ve ever seen and I would buy it again, because it works great in a room with south-facing windows, but it definitely affects the overall image quality noticeably.
I still have my 2011 17" MacBook Pro, built to order with pretty much every option available at the time, including the matte screen.
While it serves a useful purpose by diffusing unavoidable point light sources in uncontrolled environments, it's honestly not much of an improvement over its glossy contemporaries in sunlight and other brightly-lit environments, as diffusing already diffuse reflections has little effect.
I have a 2013 MBP retina with glossy screen and a 2020 HP with a matte screen.
What I've found is that inside, the HP is much better at handling reflections. However, outside, the screen gets washed out and is next to unusable. Whereas with the MBP, I can usually find an angle where reflections don't bother me and I can spend hours using it.
Your 2006 MacBook was from before Retina, a.k.a. high-resolution, displays though. Any kind of smearing effect probably improved the perception of the image because it masked the very visible pixels in the LCD.
Mine was also matte in ’06, and I had that machine for 9 years (until it was stolen :/). The only option was glossy for my replacement; I was devastated. A few machines later now, I can’t imagine going back.
The 2006 would probably have had 1080ish resolution. I think the GP's point is that at higher resolutions, matte has tended to have the issues they cited.
I am with you in preferring matte. For me, mostly because of reflections on glossy screens.
Even at ~100 dpi, the grainy character of matte coatings from that era was noticeable; my 2006 iMac and a Dell Ultrasharp from a few years later were both unmistakably grainy in a way that glossy displays are not. At the time, the matte coatings were an acceptable tradeoff and the best overall choice for many users and usage scenarios. But I can imagine they would have been quite problematic when we jumped to 200+ dpi.
To each their own but I have a matte M4 Pro and I don't like it, and the screen is noticeably worse than my glossy M2 Pro.
There's a graininess to the screen that makes it feel a little worse at all times, meanwhile I never had a problem in daylight just cranking brightness into the XDR range using Lunar.
It's especially noticeable on light UIs, where empty space gets an RGB "sparkle" to it. I noticed the same thing when picking out my XDR years ago, so it seems like they never figured out how to solve it.
> Unless your room is pitch black it is so easy to get external reflections
This is nearly my preferred setup, only I have wall lights on the wall behind the monitors so it's not truly a dark room (which is horrible for your eyes). No overhead lights allowed on while I'm at the keyboard.
If all that is true, why do professional photography monitors pretty much exclusively have matte finishes? Same for monitors used by video, CAD, or 3D professionals.
You guys need to stop reading Apple advertising material and taking it as gospel just because it has some fancy scientific words in it.
Interesting, given that in the older days of analog dark room development, you had to use a special kind of paper and heat-press it against a polished surface when drying to get a glossy photo.
I always thought matte photos were more readable, but glossy used to be more wow and have “deeper blacks”.
I tend to do outdoor things outdoors, so occasionally cranking up the brightness is not an issue.
I'd much rather do that than have a grainier screen with worse viewing angles all the time I'm not in direct sunlight, so next time around I'll be back on glossy.
All of what you say is kind of sort of true in the sense that, if you are in a room with lots of off-axis light hitting your screen and darkness behind you and you yourself are not brightly lit, then the glossy screen is better. And the glossy screen is certainly sharper.
But if there’s a window or something bright behind you, the specular reflection from the glossy and generally not anti reflective coated screen can be so bright and so full of high frequency details that it almost completely obscures the image.
And since I might be trying to work involving text in a cafe as opposed to doing detailed artistic work in a studio, I would much prefer the matte surface.
Do you prefer glossy paperwork? Glossy book pages? Glossy construction documents? The preference for a non-reflective surface for relaying dense information has been established for decades.
You know what's glossy? Movie posters and postcards.
ooh, my feathers were a bit ruffled (for reasons unrelated) when I wrote the above.
I still say for comfortable all day viewing and productivity, there is no comparison. Glossy does have more pop on a phone or watching movies in the dark, but I'd go blind doing that all day every day..
The non-reflective surfaces you cite have pigments on TOP. Screens have depth, causing parallax and light spreading. Your point would be valid if screens were paper-thin and the image pixels came out at the very surface.
Kind of a cool thing about being nearsighted: without glasses, I can get very close to things and still focus on them; I get to see very small details.
Somebody drank their portion of Kool-Aid for sure. There is that little detail that glossy screens need absolutely perfect conditions in front of them to not reflect literally the whole world, often making the resulting visuals subpar compared to matte. I have never, ever been in work conditions in the past 20 years where this didn't manifest in an annoying and distracting way.
I haven't seen a single display that ever overcame that properly for long-term work. Sure, phones use it, but they increased luminosity to absurd levels to be readable; not a solution I prefer for long daily work.
I admit there are corner cases of pro graphics where it made sense (with corresponding changes to the environment), but I am not discussing those here.
Hi! I don't think I have any way of convincing you, but I'm not an AI. Also, randomly accusing people of being an AI is fairly offensive, in case that's not obvious.
Sounds like Apple marketing wankery. I have a matte high density LCD from 2013 (Lenovo) that looks great. Does Apple even make the displays? What exactly are they "engineering" here?
The coatings, which do matter quite a bit when you are optimising for some durability/optical quality tradeoff.
Glass covers make screens more durable, but imply internal and external reflections. Laminated screens on glass panes solve the internal reflections and improve transmission, but do not help with glare and external reflections. Those can be improved by texturing the glass, but at the cost of diffraction and smearing, leading to a decrease in effective resolution. Unless the texture becomes small enough, but then you need it to be durable enough to avoid being wiped off or damaged by things that might come into contact with the screen.
It turns out that there is a lot more than the bottom layers that matter in a display. You can see all these problems being solved in succession when looking at the evolution of Apple’s displays over the years (and others’, but it is much easier to find information about the good and bad sides of any Apple product). It’s fascinating, actually.
[edit] Add the issue of oils from human skin and you have to deal with oleophobic coatings for touch screens, which is another very important factor to consider. In addition to how the touch sensors are integrated.
If anything, Apple was right back then.
Glossy has so many benefits for the places where you’d use a computer, it’s not even close. Having the option to pay premium for those few that work in environments where matte helps them makes sense. I’d pay money for my display to not be matte.
Apple was actually late to the glossy display party. HP and Dell moved to them a few years before Apple. I don't think Apple was "insisting" on them, but rather following an industry trend that they were late to.
I wonder if they will (re)introduce premium keyboards with sculpted keys that self-center your fingers someday. MagSafe coming back was nice; maybe more extra ports?
MagSafe + SD card reader + headphone jack + USB-C/TB4-only ports is fine by me. In 2025, I'm well past needing USB-C to USB-A dongles. We've had since, what, 2015/16 to start the conversion to C only.
They are really good at selling a small quantitative improvement that causes them to start using something as if it were a new type of thing going from impossible to possible. As if the tech just didn’t exist before Apple started using it.
It is probably a pretty good screen, though.
I don’t really like Apple overall. But, to some extent, it’s like… well, maybe that’s a good way of selling incremental engineering improvements.
I recently worked with a MacBook Pro and it caused uncomfortable feelings of eyestrain. I had some app that was supposed to disable the temporal dithering, but I'm not sure if it helped. I'm curious if there's anyone else on here like me who has experienced eyestrain with MacBooks and found that the nano-texture display helped.
> It's classic Apple to spend over a decade insisting that that glossy screens were the best option
I don't recall Apple ever "insisting" anything about glossy vs. matte. They simply eliminated the matte option without comment, and finally brought it back many years later.
If you have a reference to a public statement from Apple defending the elimination of the matte option, I'd like to see it.
To be clear, I've been complaining about glossy Macs ever since matte was eliminated, and I too purchased an M4 MacBook Pro soon after it was available.
It's indisputable that glossy displays have advantages over matte displays. It's also indisputable that matte displays have advantages over glossy displays, most importantly, fewer reflections of ambient light. The choice is a tradeoff.
A sentence in a PR that highlights an indisputable advantage of a glossy display does not position glossy as being superior overall but merely superior in the respects mentioned, which is not controversial.
Moreover, Apple continued to offer a matte display in the MacBook Pro for years after that PR, so why would they sell an "inferior" option?
> They simply eliminated the matte option without comment, and finally brought it back many years later.
Wasn’t the matte option that disappeared just them removing the glass in front of the screen? I seem to remember that (my MBP from that time was glossy).
The nano textured coating they are using now is quite complex and I am not quite sure it was applicable at such scales cheaply enough back in 2015.
I don't think this is exactly accurate. The matte was a ~$80 upgrade option after they released the glossy. I definitely preferred the matte screens and still do. For coding reducing glare in uncontrolled environments is way more important to me than color fidelity, but to each their own.
It's certainly on brand for Apple to face widespread criticism in the past for having matte screens as the default (computer magazines of the day found that matte finishes made screens too dim) only to face renewed criticism for dropping the thing they were previously criticized for.
It’s classic Apple-commenter behavior to not know about Apple. They offered matte display upgrades for the MacBook Pro almost 20 years ago. The current glossy black display only became a product-line-wide choice with the Retina displays in 2012, likely because they didn’t prioritize getting an appropriate matte glass finish on the Retina screens due to low demand.
I can make the same argument about you. Matte display was the standard prior to Unibody MacBook Pros in 2008.
Glossy was an available option, but not the product line wide choice.
The top of the line Late 2008 MacBook Pro (not Unibody) included:
> An antiglare CCFL-backlit 17" widescreen 1680x1050 active-matrix display (a glossy display was offered via build-to-order at no extra cost, and a higher resolution LED-backlit 1920x1200 display also was offered for an extra US$100).
A frequently overlooked point is the display brightness. The Pro models offer 1600 nits peak brightness, which makes them good units for looking at HDR content, especially if you like to take photos or edit videos. Meanwhile the Air maxes out at 500 nits, so the effect and contrast are drastically reduced on those models.
Contrast is significantly poorer on the Air display, and HDR is already in your own photos if you have a modern smartphone, so the idea that it’s niche or irrelevant is a naive take.
The perceptual difference between SDR and HDR isn’t a minor bump; it is conspicuous and a driver of realism.
If one cares about the refresh rate of their screen, then they’d trivially notice the improvement that high nit displays provide.
> and these being mini-LED displays, contrast is already infinite.
I think you may have mixed up mini-LED backlighting with OLED and microLED displays. mini-LED backlights merely allow for better local dimming of the backlight behind an LCD, but the number of independently variable backlight zones is still orders of magnitude smaller than the number of pixels. Over short distances, an LCD with local dimming is still susceptible to all of the contrast-limiting downsides of an LCD with a uniform static backlight (and local dimming brings new challenges of its own).
OLED is the mainstream display technology where individual pixels directly emit their own light, so you can truly have a completely black pixel next to a lit pixel. But there are still layers and coatings between the OLED and the user, so infinite contrast isn't actually achievable.
microLED is a so-far-unsuccessful attempt to provide the benefits of OLED without as many of the downsides (primarily, the uneven aging). Nobody has managed to make large microLED displays economically yet, and it doesn't look like the tech will be going mainstream anytime soon.
> but the number of independently variable backlight zones is still orders of magnitude smaller than the number of pixels
The appearance of a lone mouse cursor on a black screen in the dark is mildly amusing for exactly this reason. You can watch as the ghostly halo of light follows it around the screen as you move the cursor.
I'll upgrade my machine when they put an OLED display in it.
> The nano texture display is great at reducing reflections. I could immediately see the difference when placing two laptops side by side: The bright Apple Store lights showed up very prominently on the normal display, and were almost not visible at all on the nano texture display.
This is a quiet boon for those who enjoy working outdoors but find the sun/brightness a problem.
20 years ago I bought a G3 iBook because the hardware was lovely and the system was supported perfectly by stock Debian woody. (Hands up if you remember having to bless your laptop with “holy penguin pee”, part of the output of the yaboot bootloader used in PowerPC systems!)
Times changed and the best hardware for me right now is a Dell XPS from the model lines a few years back that looked like an aluminum sandwich with a black plastic filling. These machines are fantastic but (1) no OLED, (2) no high refresh rate, and (3) the keyboard isn’t great.
Could this modern Apple hardware bring me back to Free OS on pretty hardware, or is there something else I should try?
Asahi (Linux) lags quite far behind the latest Apple hardware release. If you want the Linux experience on Apple hardware, I think the best move is full-screen VM. Performance of that is more than good enough, but it does mean you are running a full non-free software stack to get to your free software VM.
I bought one of those iBooks for Debian linux, but I found the resolution was a bit small for X. At the time, I had a thing for non-intel architectures. Prior to that, I had done a lot of work packaging up Debian for Sparc machines. I had access to a wide variety of Sun workstations at my job as a sysadmin at a university.
Can't you run small LLMs on, like... a MacBook Air M1? Some models are under 1B weights; they will be almost useless, but I imagine you could run them on anything from the last 10 years.
But yeah, if you wanna run 600B+ weight models you're gonna need an insane setup to run them locally.
I run Qwen models on an MBA M4 16 GB and an MBP M2 Max 32 GB. The MBA is able to handle models in accordance with its VRAM capacity (with external cooling), e.g. Qwen3 Embedding 8B (not 1B!), but inference is 4x-6x slower than on the MBP. I suspect the weaker SoC.
Anyway, the Apple M-series SoCs give huge leverage thanks to shared memory: VRAM size == RAM size, so if you buy an M chip with 128+ GB of memory you’re pretty much able to run SOTA models locally, and the price is significantly lower than AI GPU cards.
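As a rough sanity check for "will it fit", the weights-only footprint is simple arithmetic; real runtimes also need room for the KV cache, context, and the OS, so treat these as lower bounds:

```ts
// Back-of-the-envelope sizing: weights only, ignoring KV cache and runtime overhead.
// params_B = parameter count in billions, bitsPerWeight = precision/quantization.
function weightsGiB(params_B: number, bitsPerWeight: number): number {
  const bytes = params_B * 1e9 * (bitsPerWeight / 8);
  return bytes / 2 ** 30;
}

console.log(weightsGiB(8, 16).toFixed(1));  // ~14.9 GiB: an 8B model at fp16
console.log(weightsGiB(8, 4).toFixed(1));   // ~3.7 GiB: the same model 4-bit quantized
console.log(weightsGiB(70, 4).toFixed(1));  // ~32.6 GiB: a 70B model 4-bit quantized
```

This is also why unified memory matters: on an M-series machine the whole RAM pool is available to the GPU, so the table above maps directly onto the memory configuration you buy.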
It's always interesting to see users have somewhat strong opinions over fan vs. fanless. I could never go MacBook Air again because I've been to hotter climates and do things beyond just using a browser, and invariably the keyboard gets too warm for my fingertips. I need the MBP's fans and Mac Fan Control, noise be damned.
How much of a difference would I see in compute between an M2 and M4 for example? Assuming it’s the same RAM. Did they also make the gpu and neural engine that much better between the two?
Heh, matte; finally. Gloss is such a PITA if you can't control what's behind you, which ironically is a pretty common dev-with-macbook experience. Walking around to different parts of the office. Off-sites. Etc.
I've only purchased matte screen laptops because I only use them for travel. Lenovo pretty much.
Also prefer semi-gloss for my monitors as I work in well lit daylight conditions if I can help it. There have been very high quality semi-gloss monitors for ages now.
I do not like the Apple Nano Texture. 5% of the time it really helps but 100% of the time it just reduces the picture fidelity somehow. When doing visual tasks like video editing, it is just not good.
Is it possible to install a previous macOS version on the newest MacBook model? I see people having a terrible experience with macOS Tahoe, yet I am considering purchasing a MacBook..
No, that's a separate issue. You can upgrade a M4 or earlier machine from 15.6 to 15.7 even today, despite 26.0 being out for a while, so Apple's still signing a 15.x release at the same time as they're offering 26.x releases. (You likely won't be able to downgrade from 15.7 back to 15.6.)
Downgrading a M5 machine to 15.x would be impossible not because of a signing issue but because Apple never released a 15.x build that supported M5 hardware.
I also went for the fantastic nano-texture display on my M4, after having glossy on my M1. Very happy with the decision, as I use the laptop in brightly lit environments, so I appreciate fewer reflections. Going back to a glossy display is a shock.
I was on the fence for the same reason: should I get the nano display? I opted for the 15" MBA, and the display has been great. Way better than my 2019 MacBook Pro. I've had zero issues with glare, but I'm also in an office environment during the day and use it at night when home.
I’ve been swapping back and forth between a MacBook Pro and a Linux workstation lately. The input latency difference is insane - macOS is sooo much worse than Linux. It’s gotten to the point that I’m porting code to Linux just so I don’t have to use my editor from macOS.
I don’t know how many milliseconds the difference is, but going back and forth it’s so obvious to me that it’s painful.
Lots of people can notice that. My last job involved meticulously timing our software's input-to-display latency, testing viewers' responses to it, and fighting for each and every ms we could shave off of it.
For my sins, I have recently been called upon to cold boot and then provision a few dozen Samsung tablets by hand. The "laggy Lagdroid piece of lagshit" pasta has been repeated a lot. I swear to God it just ignores ten percent of touch events if it's doing anything in the background.
Fun fact: 1 ms is approximately the amount of time it takes for sound to travel 1 foot. Do musicians move all their speakers to be within one foot of their ears? Do people in a band notice a difference if they're not standing within 1 foot of their partners? No, they don't.
I definitely notice the difference between 10 ms and 26 ms. 26 ms already feel sluggish when playing drums, guitars or keyboard instruments. But there is no way anyone can feel a difference of 1 ms.
That’s audio latency, not musicians doing music. In my experience if you have two musicians that are supposed to be playing unison, 5-6 ms is enough to feel “off”
Anecdotally, 7 ms vs 3 ms latency is felt as a weirdly heavy action when playing a MIDI keyboard. It's not felt as latency, but it's felt. And I bet the difference could be reliably established in double-blind testing (3 samples, find the outlier).
1ms seems less believable, but I wouldn't be surprised, if some people could notice that too.
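For scale, here is the latency-to-distance conversion behind the numbers in this sub-thread, using the usual ~343 m/s room-temperature figure for the speed of sound:

```ts
// Convert an audio latency into the equivalent extra distance from the source.
const speedOfSoundMs = 343; // metres per second in air at ~20 °C

function latencyToMetres(latencyMs: number): number {
  return speedOfSoundMs * (latencyMs / 1000);
}

for (const ms of [1, 3, 7, 10, 26]) {
  console.log(`${ms} ms ≈ ${latencyToMetres(ms).toFixed(2)} m of air`);
}
// 1 ms ≈ 0.34 m (roughly a foot), 7 ms ≈ 2.40 m, 26 ms ≈ 8.92 m
```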
"My ideal MacBook would probably be a MacBook Air, but with the nano-texture display! :)"
Mine as well. What is the likelihood this will happen?
I have a hunch it will not and they will either scrap the nano texture completely or keep it as differentiator for the Pro line, but I am curious what others think.
Mine too. I bought an Air in the last generation and I barely use it because I thought the 60 Hz display would be OK, but I've been living with 120 Hz everywhere for long enough that 60 Hz is actually horrible to use now. First-world problems for sure, but it's enough that I literally don't use the machine.
I’ve used MBP for many many years, but recently bought an MB Air. I slightly miss the extra ports. I love how much lighter it is. I never notice a speed difference. I’m always ssh’d into a Linux box if crunching any real data, and for UI stuff the CPU doesn’t need a fan at all. Definitely gonna stick with MB Air.
We used to sell conversion kits to shoehorn a Pixel Qi display into the ThinkPad X230. Since Apple has put 1,000-nit displays on the Pros, we don't bother anymore. The nano texture sold me and it performs wonderfully outdoors. I hate giving Apple money, but here I am.
+1 to that. Simply horrendous post-purchase support. Company representatives at all levels, from a simple technician to the head of the Linux support department, will lie straight to your face just to scam a few thousand bucks out of you.
But their keyboards are still the best, and trackpoint is unmatched.
As soon as System76 or Framework or any other vendor offers that, I'm giving them my money.
It's because Apple sucks the least. They still suck, though. They could build decent computers that are upgradeable, but they refuse because they want your $$$$ in large amounts.
I upgraded from M4 to M5 MBP because I broke my M4's screen and so my company ordered a replacement M5 while the M4 is being repaired. I can't really notice a difference at all. It's an absolute work horse, but so was the M4. I _did_ spring for the nano texture display this time around, and that is definitely nice (but nothing to do with the M5)
I have the nano-texture display on my M4. At this point, I don't think I can go back to standard glass. For text work, I find there are no downsides. If you work more with color and detailed art, I think that's the only case where you need to put extra thought into it. Otherwise get it
No, I love it. I had non-matte glass screens in my MacBooks since 2012 and I didn't realize how much better it is to no longer see lights reflected in there all the time.
I got the nanotexture on my current work M4 MBP—it doesn't completely eliminate reflected light, but it diffuses it a lot. If I were in a dark room with a light source positioned perfectly to reflect off my screen in my face, I would probably still have trouble with it, but in general I don't need to reposition the screen to avoid glare nearly as much.
I would say it's worth the extra, what, $200 or so? on the price of the M4 MBP. If it were much more expensive, I would be less sure.
It’s often much more than $200 as the base models can be had for huge discounts, like $450 off retail, but the second you check the nano texture option, you lose the discount and you tack on the extra $200. So it’s often closer to $700 in some cases.
I just upgraded from an M1 to an M5 a couple days ago.
It is rather shocking how much faster everything feels, given I didn't think my old MacBook Pro was slow. While I expected Xcode builds to be faster (and they are), I was a bit shocked when opening a new Firefox tab was instantaneous, since I hadn't noticed it wasn't before.
Another thing I didn't expect is that the new speakers have noticeably more bass and can get quite a bit louder.
I didn't get the nano-textured display, because having to adjust the display angle to get colors to render correctly is more annoying than having to do it for glare (I don't work in a high-glare environment).
Funny, I was recently picking between a glossy and nano-texture screen and came to the opposite conclusion: the glossy screen's image was so much more crisp, and I didn't really see much difference in terms of reflection.
I have the M4 Max. The fans never really come on unless I launch something that maxes out the GPUs, which I rarely do. I do have some software projects that use all CPUs and maxes those out while they build (all 14 of them). The fans stay silent.
This is, by far, the fastest machine I've ever had. My previous laptop was a more modest M1 MacBook Pro. And before that I was on a cheapo Intel i5 Samsung laptop, a stopgap solution after my last Intel Mac died when a loose keyboard key destroyed the screen (yep, the generation with the crappy keyboards, worst Mac I've ever owned). That Intel machine was of course pathetic and shit. I wasn't expecting much and it disappointed me despite that. The M1 was about 3x faster. The M4 Max is a beast. In terms of build speeds, the i5 was unusable while building and would take 15 minutes. The M1 got it down to 5 minutes (10 CPU cores that are faster than the 4 Intel ones). But it didn't have enough memory, so swapping slowed it down a bit. The M4 Max builds stuff in around 30 seconds. No more swapping, and the 14 cores are quite a bit faster than the M1 ones. Same project (but of course with a few years of development). We have more tests now, not fewer.
Otherwise it's a great laptop. The keyboard is fine. The touchpad is best in class in the industry (everything else is pathetically mediocre in comparison; it's not even close), and the screen is best in class as well (contrast, colors, resolution, everything). And Apple learned its lesson when it comes to keyboards. Most Windows/Linux laptops I'm aware of are a compromise between heating/cooling, lousy input and output devices, performance, design, screen quality, etc. Apple nails all of those things. Nobody else does.
High-end Macs are not cheap. But for professionals it's a minor expense. If you lease a car for getting your ass to work every morning, you are probably spending at least 2-3x more than what this would cost you. And the whole point of getting to work is to open your laptop and earn a living with it. It's more important than the damn car; it's what pays for that car. I spend less than what used to be 1 hour of my freelance rate per month on this absolute monster. Maybe it's 2 hours for you if you just got started. That's still nothing against 160-ish billable hours per month. Employers tend to be less enlightened, of course. But if it's your choice, don't be frugal: buy the laptop you need. If a simple browser is all you need, of course get something decent-looking like a MacBook Air or whatever. But otherwise, get the best you can afford. I've compromised once, with that Samsung. I did not enjoy that.
The part about noticing web pages loading (at most) 8ms faster due to the display is total nonsense. Many can notice the difference between 60 and 120Hz when scrolling, but definitely not for a page load. That’s less than 1/10th of the blink of an eye.
If page load seems noticeably faster, it’s far more likely that it’s simply a faster machine. Or imaginary.
I still have a 2019 MacBook Pro with the non-butterfly keyboard and escape key (unfortunately still the Touch Bar).
It’s still a great laptop except the battery lasts maybe 75 mins. I just keep it plugged in but despite the fact it’s 6 years old I don’t notice any problems with it.
I’m tempted to buy an M4 laptop just because it’s “new” and “faster” but then I ask myself Why? It’s the same thing with my iPhone. Until my laptop dies or there is something functional that I can’t do with my old laptop I’m going to keep using it.
I have done real work, using a computer 10+ hours a day, on every ecosystem: Windows, Linux, Mac. I've used each for about 10 years apiece.
My most recent laptop died, and it really showed me what I appreciate in a laptop: performance, build quality, light weight, good battery, low noise, good ergonomics.
I was sick of the recent overheating generation of PC laptops that don't last more than a couple of years under my usage.
As a result I decided to try to switch back to a macbook after a decade hiatus.
The hardware is good but the software is absolute garbage. Trialing it for a week, the amount of bullshit in macOS was enough, and Asahi wasn't there yet either. Instead I decided to get an AMD Framework laptop.
Best decision ever.
I have a laptop that's got great quality, can be upgraded without paying a $5k tax, whose keyboard can be replaced for $100 instead of $700, and it works with me rather than against me and my wallet.
Same experience. I cannot consider any screen that does not have the nano texture coating. It is exceptional and a huge improvement. To the point that I actually prefer a tester Samsung Galaxy S25 Ultra over Apple’s own iPhone display.
I couldn't really trust the author of the review after he established his preference for "quiet computers" having no cooling slots or whatever he called them. Okay, fine, you're placing aesthetics above actual performance, then. The Pro laptops are the only ones viable for any really hardcore work because if you push the Air too hard it is going to just slow down in order to stay cool and that's not what you want if you are doing graphics work or in my case, I was running a bunch of containers in K8s. I never bought an Air because it was too similar to an iPad.
The thing that mostly irks me about Apple these days is soldered-in RAM and non-upgradeable storage. Apple is still the best thing going for doing most pro development work, but it's just so irritating that they shit on us like this. I did buy an M4 Mini and expanded it some. My 2019 MB Pro is sitting here on the desk, mostly unused these days. The Intel Macs are basically dead now: still great computers, but no longer desirable. My daughter is doing graphic arts in college and is using another 2019 Pro for that. I've used Macs continuously since at least 2014.
>The thing that mostly irks me about Apple these days is soldered in RAM and non-upgradeable storage.
Isn't the 'soldered-in' RAM and storage fundamental to the M-series architecture? It's not like there's a board with individual chips sitting in it for the RAM and storage, that could potentially have been 'popped out' if they weren't soldered in. It's all one giant 'chip' now.
No, the M series is a system on a chip (SoC); that's why it's able to run local LLM models in a range impossible for other laptop brands: VRAM == RAM, unified shared memory at max speed for both CPU and GPU.
We're still waiting to see any CAMM-style memory module show up in a mass market product at any speed, instead of merely getting press coverage where the number of articles written seems to outnumber the number of laptops actually built and shipped. But even if you are willing to take the examples thus far seriously as real products, they haven't come close to matching the speed of soldered LPDDR.
I was considering an LPCAMM2-fitted ThinkPad. I was eyeing one with less memory, planning to then buy a 96GB module to upgrade it. However, the module was nowhere to be found in stock, and where it was found, it was priced almost like the whole laptop.
CAMM is still less effective than in-package RAM bundled with your CPU. The Framework folks looked into using CAMM for their recent AMD APU-based desktop and it was a no go.
After 18 years of Mac-abstinence, I just bought a MacBook Air and realized there is apparently no way to change the App Store language without changing region and payment method. WTF? That seems like the most basic thing one could imagine. What has happened to Apple?
I was able to switch the App Store language from English to Spanish by changing my primary language in System Settings > Language & Region > Preferred Languages.
It didn't require me to switch my region or payment method.
Why did you think Apple was user friendly or flexible...it's the Apple way or the highway. Most only stick around because of the currently superior hardware
macOS does not have auto-update. In fact it doesn't bother you with any updates, which led to me falling behind on patches because I was accustomed to Windows nagging me for updates every week.
Patently false on modern macOS. I get a reminder about Tahoe every week or two. Plus a persistent red "1" dot on the Settings app that you can't dismiss. And a huge info/advert panel in the 'Software Update' section of Settings about Tahoe, which you also can't dismiss.
I'll never understand picky preferences about monitors... I still use an LG Flatron wide that's old enough to vote... and when I Slack at the Apple Store, it's not like I notice some life-or-death difference. A monitor is a monitor.
OK, I guess for graphic designers it might matter more?
Some old LCD displays were quite crisp. Sure, you can see individual pixels. The mouse tail has a clear zig-zag. But I find these nice on the eyes in their own way. I suspect because eyes autofocus more easily.
New super high-res displays are also nice on my eyes. The displays in between, those from the last decade or so, have been hit or miss for me.
If you'd like to change that, you can go to System Settings → Battery → Options → Wake for Network Access
Or just search for "Power Nap" (what it used to be called). They usually wake up intermittently for Time Machine backups, wake-on-LAN and other stuff.
I have mine set to `NEVER` [wake for network access] and yet it still makes DNS requests often while asleep.
Curiously, it is able to maintain network connection even through the 1/4" steel of the safe it's stored within. The older Intel MBP doesn't and cannot.
> I still don’t like macOS and would prefer to run Linux on this laptop. But Asahi Linux still needs some work before it’s usable for me (I need external display output, and M4 support). This doesn’t bother me too much, though, as I don’t use this computer for serious work.
“I don’t use this computer for serious work.” Dropped $3K on MBP to play around with. Definitely should have gotten MBA
If you are going to start making a list of expensive hobbies, $3K for a computer isn't going to be anywhere near the top of the list.
It's not the absolute expense, it's the delta over what else would have worked just as well.
The type of person shelling out 3k for a computer is not running it until the wheels come off.
I don't think you can say that -- I paid about that for my 2021 M1 Max with 64GB and I'm still using it four years later as my main machine. There's an argument to be made to buy an expensive computer every 5 years or so rather than a cheaper one that you need to replace every 2 years because it's become unbearably slow.
Same here: I paid about twice as much for my 2013 Mac Pro that I’ll probably keep using until I replace it with an M5 Mac Studio at some point next year, which I’ll then plan to use for at least 5 years.
As for camera lenses, I expect my collection of manual focus F-mount Zeiss primes to have a longer useful life than their owner.
same here; I bought a M2 Max with 96GB of RAM almost 3 years ago, for €4K, but a client paid half of it for a 1 year retainer. This machine is still the best thing i've worked with, and I have zero intentions of switching this machine anytime soon (i'll probably need to replace it's battery in the future). Rather keep the same machine for 5 or 6 years than to buy a crappier one every 2 years
My laptop is still a 2012 MBP. Granted I don’t use a laptop as my main computer, I use a hackintosh desktop. I might finally buy a new laptop in 2026, 14 years is not bad. If my new laptop can last that long I see no problem maxing out the specs at time of purchase.
What does the purchase price have to do with it? Seems like it would entirely depend on circumstances and constraints, rather than cost, how long someone would run something
Tells me they are price insensitive and probably get a new computer every couple of years.
That reasoning does not make any sense. I spend $3-4k on a MBP and run it till it fall apart, usually 5-7 years later.
I reckon it makes some sense for Apple users. You have to be willing (and financially able) to upgrade when Apple says. Apple forcefully obsoletes their products way too quickly to be a viable option if you care about longevity[0]. I have five excellent-condition still-perfectly-working Apple products next to me, none of which have current operating system support from Apple.
[0] EDIT: for reference, my previous ThinkPad lasted me 14 years.
14 years as your main driver? Because that's what we're talking about.
14 is a indeed very long. Let’s instead assume 12, it’s 2013 and you got a top specced T440 with 4th gen i7. That’s actually not bad and the build quality is like a tank as all Thinkpads. Nothing I would use as daily driver myself but having used many other thinkpads of that generation I can see why others are still getting by with it today.
Since we are talking about OS support. 4th gen Intel isn’t supported by Windows 11, so you’d have to upgrade to Linux.
Out of curiosity, how much of that ThinkPad were you able to upgrade? Could that be the difference between 5 and 14 years here?
It makes sense for some people, and doesn't for others. Not particularly surprising or insightful.
>I have five excellent-condition still-perfectly-working Apple products next to me, none of which have current operating system support from Apple.
If they're working perfectly, why does it matter if they have current operating support? It doesn't seem like you're dependent on Apple.
Software drops support for certain OS versions even if the device still can run it.
The first iPad Pro can’t run adobe products for example.
The Mac is a bit more resilient to this, but it’s still worrying as yearly improvements become subtler.
I have the M1 Max. It’s still going hard. Not planning to replace it anytime soon.
Bullshit. I shelled out $3k for my MBP M1 back in 2021 and I intend to use it until I can't anymore.
It depends on the person and the use case. Different personalities etc
That's not particularly rational given how quickly computers progress in both performance and cost, a current-gen $1k Macbook Air will run circles around your M1. You'd probably be much better off spending the same amount of money on cheaper machines with a more frequent upgrade cadence. And you can always sell your old ones on eBay or something.
There are other factors to consider such as screen size, storage and RAM, connectivity and ports, active versus passive cooling (thermal throttling), and speaker quality. Additionally, the M1 Pro GPU benchmarks still outperform the latest M4 Air.
For example if I spec out a 13" M4 MBA to match my current 14" M1 Pro MBP, which with tax came to ~$3k in 2021 (32GB RAM, 1TB storage), that $1k MBA ends up being ~$1900. Now that more frequent upgrade cadence doesn't make as much sense financially. After one purchase and one upgrade, you've exceeded the cost of the M1 Pro MBP purchase.
Overall I don't disagree with your sentiment, especially for more casual use cases, but progress will never stop. There will always be a newer laptop with better specs coming out. I personally would rather beef up a machine and then drive it until it dies or can no longer perform the tasks I need it to.
I like using computers until they break on me; I've never really felt (for the usage I give my MacBook) that it is lacking in power. Even after, what, 5 years?
I think I'll be upgrading in the next 2 or maybe 3 years if Apple puts OLED screens on their new machines, as is rumored.
Respectfully, this is also bullshit for my use case. For me, the M1 purchase was a step up compared to Intel; the rest is diminishing returns for now.
It’s also not true if you care about certain workloads like LLM performance. My biggest concern for example is memory size and bandwidth, and older chips compare quite favorably to new chips where “GPU VRAM size” now differentiates the premium market and becomes a further upsell, making it less cost-effective. :( I can justify $3k for “run a small LLM on my laptop for my job as ML researcher,” but I still can’t justify $10k for “run a larger model on my Mac Studio”
See https://github.com/ggml-org/llama.cpp/discussions/4167#discu...
I have an M2 Ultra. I don't see myself getting rid of it for another 5 years at least.
M2 here also, still flies for cross platform mobile development. The 250GB storage space is a bit tight without external storage but my dev environment is lean and purges caches every day so I manage easily.
Just wait until they see the price of a 300/2.8 lens or quantum-tuned rocks to isolate power cables from the floor.
I think it actually would be quite near the top, in terms of ranking. Most hobbies are a lot cheaper.
Of course, not near the top in terms of money because there are a few hobbies that cost vastly more.
> Most hobbies are a lot cheaper.
Sure, but I did specify expensive hobbies.
Just off the top of my head, hobbies I've been in or around where this $3k would be a nothing burger: photography, woodworking, being a grease monkey, cycling, gun collecting, antiquing, recreational substances...
You can absolutely be a hobbyist photographer for a fraction of $3k. A hobbyist lens collector is a different story.
> photography, wood working, grease monkey, cycling, gun collecting, antiquing, recreational substances
Yacht owner says ‘hold my beer’.
Fiberglass sailboats last forever and the hobby is dying as people age out of it. I'm in the sailing community and get offered nice free boats in usable condition every year, but I already have 2 so I refuse any more. This year alone I've turned down both a 40ft and a 23ft free boat from 80-90 year old friends who aged out. Parts are expensive, but if you can do repairs yourself, you can absolutely own a pretty nice sailboat for about what it costs to buy a new Apple laptop. I paid $1800 at auction for my most recent sailboat; it is only 7 years old and needed nothing. Did an overnight trip on it recently.
I want to find a way to revive the hobby by showing younger people short on money that they can get into sailing for less than they already spend on much less rewarding stuff like app subscriptions and smartphones.
Well, there’s hobbies and there’s a buying addiction that comes with a hobby.
In many areas there’s a tendency to overdo it with tools, gadgets and also to compensate for lack of skill with more gadgets. I do woodworking for example and my total spend for industrial vacuum, different types of power and hand tools, work bench, clamps, etc probably comes to around a few thousand EUR. Mine is a really good set-up for a hobby, but I still don’t have any stationary machines or fancy separate work area or room. I bought everything over the years and I only buy brand-name. My point is, this is actually a lot of money especially if spent as lump sum and not at all a “nothing-burger”.
I actually can’t think of one hobby that costs less than $3k
Knitting / crocheting / quilting / embroidering? Drawing / painting / calligraphy? Singing in a choir? Creative writing / journaling / blogging? Solving crossword puzzles? Bird watching? Day hikes? Reading? Visiting museums? Learning about history / philosophy / art / whatever? Learning a language? Taking dance classes? Playing chess or petanque or any other game that doesn’t require expensive gear? Or most sports?
A lot of things are cheap to taste — a second hand bike and some $200 running shoes and you’re training for a triathlon. Or a makerspace membership and you’re now sewing or doing 3d printing.
It’s once you get “serious” and need to have your own equipment that all these things get real. Or in the case of things like social dance, you want to take time off with and travel further and further away to attend pricey exchanges and camps.
It’s perfectly possible to enjoy hobbies deeply without getting “serious” in the way you describe.
I’ve taken my 10 euro dance classes for years without feeling the necessity of pricey exchanges and camps.
My neighbour goes to the park many evenings to play petanque, doesn’t cost him anything.
A couple I'm friends with goes on day hikes where they do bird watching; maybe they bought a nice pair of binoculars once? Another couple likes to do jigsaw puzzles together, not exactly breaking the bank!
My sister is learning Finnish because she never learned a non indo-european language. She bought a book.
I would wager most people's hobbies are low-key like this, because either they don't have the disposable income to spend on them, or they don't want to!
Absolutely yeah, and regardless of whether it ends up eventually being expensive, I think part of what I’m saying is that it is important to know how to at least start something cheaply.
I get very frustrated with the kind of people who see one tiktok about a thing and suddenly feel like they need to spend $3k to pursue whatever their new passion is.
Besides programming, my hobbies are writing stories, writing and recording songs, drawing, and painting. None of them needs to cost anywhere near $3000. Any of them can cost as much as you want.
Take the music hobby as an example. I have several expensive guitars now, but in the first 20 years of that hobby I probably spent under $1000 on guitars and related gear the entire time.
Running. You only need good shoes, really. Words from a coworker who runs marathons.
For me the only one would be sketching/painting. But I agree with the point in general, most hobbies cost a lot.
cross training ?
No, if cross training qualified, those in cross training would be sure to tell you they did cross training and go into details about it
What do you mean "in terms of ranking" vs "in terms of money"?
Median vs mean, essentially, is how I read it.
I mean if you ranked all the hobbies in terms of cost, casually spending $3k on a laptop would be near the top of the list. But there are a small number of hobbies that are vastly more expensive.
The distribution is highly skewed. Like wealth. The 99th percentile are near the top in rank (by definition) but nowhere near the top in absolute terms.
I cannot think of many hobbies which are less expensive if you are serious about them. Some hobbies around me where $3000 wouldn't get you far: motorcycles, cars, cycling, collecting anything, woodworking, machining, music making, traveling, horses...
The cycling industry works hard to make sure people think they need expensive bicycles, but you can perfectly well enjoy cycling as a hobby without spending a fortune on it.
And in contrast to computers, a bicycle from 40 years ago still does the same job as it did at the time; there is no software making it incompatible and it doesn't feel slower than the more modern stuff. All you need is a set of brake pads, cables, tires, a chain and a cassette every once in a while. All these consumables are fairly cheap if you aren't chasing the newest/highest-end tech and stick to 2x9 / 2x10 speed transmissions.
Some of those, like horses, are 1% hobbies. But many of the others can be done very affordably. Buying used equipment, learning from YouTube and online resources, starting small and scaling gradually make most of those hobbies accessible at a fraction of the cost.
I can think of dozens. Running, dance, knitting, painting, woodworking (you can go very far for much less than $3k), archery, chess, board games, drawing, painting, brewing, darts, cycling, etc. etc.
Obviously you can spend pretty much any amount of money on those if you want (if you are "serious" about it) but you don't have to and most people don't. Also he said this $3k expenditure wasn't for serious work.
There’s some nuance to it.
Judging by the author's preference for Linux, I'm guessing this hobby has some professional applications as well.
$3k is the price of a very nice guitar, but I am not about to casually shell out that money every few years.
However, I earn my wage using a computer, so it’s a lot easier to justify staying relatively current on specs.
I interpreted it as: if you include all hobbies and games made by humans in history, I'm pretty sure most of them involve a set of cards made of paper, with some others involving wooden figurines (chess, checkers) or even drawing in the dirt with a stick.
A computer is many, many orders of magnitude more complex and expensive than that.
This isn't said with the intention to demonize expensive hobbies if no one is harmed because of it.
But I do sometimes wonder if my hobbies are too dependent on a power plug. Even reading, which I do with an e-reader.
Try general aviation as a hobby. It will be eye opening
Thinking of it as a hobby is an American thing. I've never met anyone who does it, but for Kobe Bryant, Harrison Ford or Tom Cruise it seems normal.
Most people save $400 per month tops, which they then spend on holidays.
It's sad that more countries outside of North America haven't actively developed their general aviation industries. It's never going to be cheap (or safe), but there's no good reason to impose the high taxes and regulatory constraints that keep it out of reach of regular upper-middle-class people in many countries.
It’s a doable common hobby for middle class Americans. I grew up in a rural area with a dirt airstrip and everyone owned planes- even people that could barely afford a reliable used car. You can sometimes find something like an old Cessna for about $20k, and if you’re willing to do “experimental” planes that you fix yourself, sometimes just a few $k. Like anything, if you’re an insider in the community you can get good deals, sometimes even free from friends that age out, etc.
Many universities in rural areas have student clubs that offer lessons and rent club owned planes for cheap.
> even people that could barely afford a reliable used car. You can sometimes find something like an old Cessna for about $20k,
Not sure what you call a "reliable used car". My low-mileage-for-its-age 2006 Mercedes B200 cost me €5.5k, for instance. A car doesn't have to cost a lot to be reliable.
Around me, $20k is an expensive price for a car, and most people buy second-hand 20+ year-old cars for less than €5k.
> $3K for a computer isn't going to be anywhere near the top of the list.
That says a lot about the community you live in.
That they've worked hard to be able to afford nice things? What do you think it says, exactly? This is a pretty irritating comment.
Worked hard, won a lottery, whatever. It mostly says that these are people with tens of thousands to burn on fun stuff, and such people are a rather narrow slice of the population. There's nothing bad about that, it's just a rather niche community, whose opinions may not be very relevant for the large majority of people outside that niche.
HN is that niche community, though. HN is a forum targeting a niche community that skews technical no matter where someone is physically from, and that community skews relatively rich. Sure, one could concern-troll about starving kids in Africa while there are literal billionaires posting here; I'm not saying we shouldn't say something for fear of their feelings, nor am I saying that everyone here must be rich in order to comment here. Just that we can recognize that some members of this niche community are inordinately rich. Advertising e.g. the Volonaut here will likely generate a couple of sales, and if you thought a $3k laptop was a lot, definitely don't look that one up.
That was not a judgment, good or bad. Simply an observation.
Seriously. Stapelberg is a talented guy that's done well for himself, why can't he have nice things if he wants them?
This whole discussion is weird. For the majority of the world's population, dropping $3K on a computer is a non-starter, if even possible. Over six hundred million people cannot even afford proper food and shelter. But there are also sixty-two million millionaires in the world. So there are a large number of people who can buy a MBP without even blinking. We've just discovered income disparity. What the heck does that obvious truth have to add to a review of a MBP?
The "mac community" is even worse. I recently spend $4k on linux laptop, and I get endless criticism, that it is "too expensive" for a "windows pc". I need spec for my work, and comparable mac is 4x more expensive!
Maxing out an MBP, I couldn't get to much more than about $8k. And "comparable" is probably generous.
Is this 16,000 dollar laptop in the room with us now?
To be pedantic, a maxed-out MBP is 90,200 BRL in Brazil now before AC+ and software, around 16,777 USD.
just skip going out to lunch once and eat a turkey sandwich instead /s
He lives in Switzerland. $3K barely pays for a lunch and an espresso.
Computers are actually cheap as far as Swiss taxes go (I bought my first MacBook Pro when intel came out at EPFL). I’m sure they got their computer for about the same price as you could get it in Hong Kong. But ya, food, rent, and services are pricey in Switzerland, even if you are just grabbing a croissant at coop.
Jokes aside, electronics is way cheaper here (also thanks to a relatively low VAT) than in most countries - although Apple keeps their prices pretty much the same across the world.
There's a reason the Zurich airport has a vending machine that sells slips of gold, platinum, and palladium
I'm not sure what the actual reason is, but my first instinct is "tax evasion".
This is funny because MBA could mean two things.
MBAs typically use MBAs.
Lmao plus MBA works great for relatively serious work. I was hesitant to switch from MBP but the M1 air almost never lets me down.
> I don’t notice going back to 60 Hz displays on computers. However, on phones, where a lot more animations are a key part of the user experience, I think 120 Hz displays are more interesting.
I'm always so jealous of these people; 60 Hz is just so bad for me now and even makes me a bit motion sick.
I can see it in everything: moving a window, scrolling, the cursor.
I've made a test for myself. Screen split into two parts, two small squares moving and bouncing. First square moves every frame, second square skips every second frame, but moves 2x. So basically one half of the screen is full FPS, another half of the screen is half FPS. And I implemented it as a "blind test", so I could make a guess and then check it.
On the 60 Hz screen, the difference between 30 FPS and 60 FPS was pretty obvious and I could guess it 100% of the time.
On the 144 Hz screen, the difference between 72 FPS and 144 FPS was not obvious at all and I couldn't reliably guess it. I also checked it with a few other people, and they all failed this simple test.
So now I hold the firm opinion that these high-FPS displays are a marketing gimmick.
https://pastebin.com/raw/hwR62Yhi here's HTML, save it and open. left click reveals which half is "fast" (full FPS) or "slow" (half FPS), scroll changes speed, F5 generates new test.
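If you'd rather not run a random pastebin, the core of the test is only a few lines; here's a minimal TypeScript/Canvas sketch of the same idea (not the linked file, and without the blind-test reveal): one square moves every frame, the other moves only on even frames but twice as far, so both cover the same distance per second.

```ts
// Two squares: top one updates every frame (full FPS), bottom one updates on
// even frames only but with double the step (half the effective FPS).
const canvas = document.createElement("canvas");
canvas.width = 600;
canvas.height = 200;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

let frame = 0;
let xFull = 0; // moves every frame
let xHalf = 0; // moves every second frame, twice as far
const step = 4;

function draw() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  xFull = (xFull + step) % canvas.width;
  if (frame % 2 === 0) xHalf = (xHalf + 2 * step) % canvas.width;
  ctx.fillRect(xFull, 40, 20, 20);  // full frame rate
  ctx.fillRect(xHalf, 140, 20, 20); // half frame rate
  frame++;
  requestAnimationFrame(draw);
}
requestAnimationFrame(draw);
```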
For me it's the motion clarity that I notice the most. Higher FPS is just one way to get more clarity though, with other methods like black frame insertion then even 60 fps feels like 240.
Thanks for sharing the test. I'm surprised you aren't able to tell the difference -- I can pretty consistently (90%+) get the right answer for both sides with 120 fps as the "fast" rate, at speeds as low as ~500. At higher speeds it's much easier.
You can’t write it off as a marketing gimmick just because you and a few others personally can’t see the difference, many people demonstrably can.
> So now I'm holding firm opinion, that these high-FPS displays are marketing gimmick.
While I agree the jump from 60 -> 140 hz/fps is not as noticeable as 30 -> 60, calling everything above 60 a ”marketing gimmick” is silly. When my screen or TV falls back to 60hz for whatever reason I can notice it immediately, you don’t have to do anything else than move your mouse or scroll down a webpage.
If I hook up an LED to a microcontroller and blink it at increasing frequencies until I stop being able to see the flicker (for me about 85 Hz), then if brain hardware is optimized, shouldn't I fail to notice a difference at twice that frequency, a la the Nyquist sampling theorem?
Pretty cool test, but I wonder how fast you ran them at? I was able to distinguish between full and half after increasing the speed to around ~2000 units.
It's interesting how different people pick up different details. I can't really see the difference between 60Hz and 120Hz for example, but I'm unusually sensitive to bad kerning. The nano texture screen also screams smearing and low resolution to me.
Same. I currently have a 160hz and a 240hz monitor. And I can tell the difference between them when scrolling pages with tons of text.
There's less ghosting in 240hz.
And scrolling on 60hz to me looks blurry.
I'd like to think that those who don't notice the difference have improved brain GPUs that can compensate for ghosting.
How can you know it is not bias? For what it's worth, you might never have noticed any difference if you didn't know they weren't refreshing at the same frequency.
Oh for me it's very clear.
Especially between 60 Hz and 120+.
Scrolling looks blurry/ghosted in 60hz.
I guess I could vibe code an app to set monitor Hz randomly in either 60 or 280 and test.
But it would be a waste of time from how clearly I can tell the difference.
> I'd like to think that those who don't notice the difference have improved brain GPUs that can compensate for ghosting.
Wow. My perspective was those that did notice the difference were more perceptive. Thank you - now I realize there is a completely different take. (I'm not sure that it's helpful mind you... but it gives me something to chew on).
Wait until you try an OLED computer monitor, that screws with the "higher refresh rate => less ghosting" thought process completely.
Oh yeah I have an Oled LG C4 TV with 120hz refresh rate.
Can't go back to non-oleds.
Agree completely with this.
When I use a desktop display, my pattern is: load page, read content for 10-30 seconds, scroll, repeat.
When I use a phone, the read-time-before-scroll is more like 1-5 seconds due to the much smaller display.
I notice the scrolling blur in both places on 60 Hz displays, but it bothers me way more on a phone because I'm scrolling so much more.
I regularly switch between a 120 Hz Android and a 60 Hz iPhone. It's bad for maybe 2 or 3 minutes, then the brain gets used to it once again.
There is nothing groundbreaking about 120hz.
The curse of high standards. I wish I didn't notice a lot of things. I wish I could stop thinking about why something that is clearly better hasn't been done.
I would live a much happier life.
I don't think in this case it's high standards; my eyes are just unable to really notice the difference.
Crazy. I switch between my work’s M4 MacBook Pro and my personal M3 MacBook Air all the time and I forget that the displays are even different.
I can tell the difference between 120 and 60 just fine and of course prefer better, but it doesn’t bother me.
It’s unfortunate if it bothers you. I have the same reaction to 30Hz.
Same, thankfully it's now completely gone on phones. I like the MBA 13 for its form factor, but the screen feels broken to me.
I'm right there with you, 60hz feels like a flip book to me now.
This is such a weird experience for me. On my phone, I instantly notice going back to 60 from 90 hz. But on my computers and handheld consoles, I don't mind, or even notice, at all.
How do you watch movies or TV without throwing up?
Motion blur mitigates the issue to some extent, which is why 24fps films are watchable.
The major difference is that in one case you're watching something without interacting with it, while in the other it is responding to your actions; in one your gaze is relatively still, taking in the entire frame, while in the other your eyes are tracking an object as you interact with it via some sort of input device.
In tracking motion your eyes/brain can see improved motion resolution (how clear the details are in an object moving across the screen) up to 1000Hz.
Your body and nervous system processing has input lag on the order of 100 ms, and variance on the order of tens of ms, though.
But your eyes can track a moving object (like a car, or a ball, or a cursor or text on a scrolling webpage); they don’t stay 100ms behind it.
Distance to screen matters.
Personally I've had concussions and bad screens do make me sick. Even 60hz TVs if I'm sitting somewhat close, particularly for certain content. All the chaos of Dr. Strange / Multiverse was too much for me to watch.
> My ideal MacBook would probably be a MacBook Air, but with the nano-texture display! :)
I agree on the nano-texture display, having used one in person for a little bit. It's sort of like an ultra-fine matte texture that isn't noticeable while using it, but is noticeable compared to other devices in the same room. I hope it becomes a more standard option on future devices.
That said, I've used Thinkpads with matte displays and while not as fine, they mostly have the same benefit.
I think my ideal would be a MacBook Air with both the nano texture and the higher 120 Hz refresh rate the Pro has. With that, I'll trade an extra second of compile time for my Rust projects for the smaller form factor.
are rust devs the new vegans?
It's the first matte screen on a MacBook since 2011.
I ran that thing for like 6 years til the replacement for the failed GPU failed again.
More matte screens please!
To me it looked very much like the matte coating on Dell monitors, where bunched-up same-color pixels have this "feels like there's a rainbow here but if I focus on it I don't see it anymore" effect. Definitely better than ThinkPad matte, though.
I’d love an air with a high density display.
My mom has an M1 Air, and its resolution is not great. Everything looks a bit blurry compared with my 4K Dell XPS and my wife's M4 MacBook Pro display. I guess the Air's native resolution means it has to do fractional scaling.
The m1 air native resolution is 2560x1600 and the 'best for display' default is 1280x800, that's 2x integer scaling. But yeah if you have a different resolution set, it'll be fractional and probably a bit blurry in comparison.
The Air has roughly a 218 ppi screen, but your wife might have a non-integer resolution selected.
Yeah the default doesn't do a 1:1 display to pixel ratio.
Just to be pedantic it is integer scaled (from 1440x900 to 2880x1800 but then resampled down to the native resolution of the MBA 2560x1600 via something better than bilinear).
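For anyone lost in the scaling discussion, here's a rough sketch of the arithmetic as described above (the helper name and the "render at 2x, then resample" model are a simplification for illustration, not Apple's documented pipeline):

```ts
// 13" M1 Air panel: 2560x1600 native pixels.
const native = { w: 2560, h: 1600 };

function hiDpiScaling(points: { w: number; h: number }) {
  // HiDPI content is rendered into a backing store at 2x the logical point size...
  const backing = { w: points.w * 2, h: points.h * 2 };
  // ...then that framebuffer is resampled to the panel's native pixel grid.
  const resampleFactor = native.w / backing.w;
  return { backing, resampleFactor };
}

console.log(hiDpiScaling({ w: 1280, h: 800 })); // factor 1.0   -> pixel-perfect
console.log(hiDpiScaling({ w: 1440, h: 900 })); // factor ~0.89 -> resampled, slightly soft
```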
What is going on with the Dells recently?
Dang I was gonna get one with nano texture but the opinion was 50/50 everywhere so I went with the Devil I know
One thing that wasn't mentioned is the max sustained screen brightness for SDR, which is higher on the M4 Pro (1000 nits) compared to the M4 Air or M1 Pro (500 nits).
There’s an awesome app called Vivid which just opens the HDR max brightness. I use it all the time with my M3 Pro when working outside and I believe it also works on earlier models.
There are so many base features that are inexplicably relegated to 3rd party apps. Like a better finder experience. Or keeping screen on. Or NTFS writing.
NTFS writing isn't that inexplicable. NTFS is a proprietary filesystem that isn't at all simple to implement and the ntfs-3g driver got there by reverse engineering. Apple doesn't want to enable something by default that could potentially corrupt the filesystem because Microsoft could be doing something unexpected and undocumented.
Meanwhile if you need widespread compatibility nearly everything supports exFAT and if you need a real filesystem then the Mac and Windows drivers for open source filesystems are less likely to corrupt your data.
I'll take ntfs-3g over the best implementation of exFAT in a heartbeat. Refusing to write to NTFS for reliability purposes, and thereby pushing people onto exFAT, is shooting yourself in the foot.
At which point you're asking why Apple doesn't have default support for something like ext4, which is a decent point.
That would both get you easier compatibility between Mac and Linux and solve the NTFS write issue without any more trouble than it's giving people now because then you'd just install the ext4 driver on the Windows machine instead of the NTFS driver on the Mac.
Is it that easy to use on Windows these days? I should give it a try.
Apple is likely in a position to negotiate NTFS documentation access with Microsoft for a clean-room implementation, with NDAs and everything.
My money is on Apple not having the will to do that.
> There are so many base features that are inexplicably relegated to 3rd party apps.
> Like a better finder experience.
> Or keeping screen on.
Do you mind linking or naming which tools you use for those 2 purposes?
Asking out of pure curiosity, as for keeping the screen on, I just use `caffeinate -imdsu` in the terminal. Previously used Amphetamine, but I ended up having some minor issues with it, and I didn't need any of its advanced features (which could definitely be useful to some people, I admit, just not me). I just wanted to have a simple toggle for "keep the device and/or display from sleeping" mode, so I just switched to `caffeinate -imdsu` (which is built-in).
As for Finder, I didn't really feel the need for anything different, but I would gladly try out and potentially switch to something better, if you are willing to recommend your alternative.
Default Folder X is a huge improvement to Finder, specifically open and save windows. It's in SetApp too.
Not op but raycast is for sure an improvement on the stock finder.
https://www.raycast.com/
I use the Finder and Raycast heavily. Raycast is not, and does not sell itself as, a Finder equivalent.
OP: I've tried all the Finder replacements. Path Finder, for example. At the end of the day, I went back to Finder. I always have a single window on screen with the tabs that I use all day. This helps enormously. I show it on YouTube here (direct timestamp link): https://youtu.be/BzJ8j0Q_Ed4?si=VVMD54EJ-XsxkYzm&t=338
You can use Raycast to directly open files. I show that here: https://www.youtube.com/watch?v=yKbtoR2q_Ds&t=482s - still doesn't make it a Finder replacement.
Finder is the number one reason it boggles my mind that people claim macOS as head and shoulders above other OSes "for professionals". Finder is a badly designed child's toy that does nothing at all intuitively and, in fact, actively does things in the most backwards ways possible. It's like taking the worst of Explorer (from Windows XP), and smashing it into the worst of Dolphin or Nautilus; and, to top it off, then hiding any and all remaining useful functionality behind obscure hot keys.
It has been more or less the same as long as I've used it (20 years or so). Familiarity is a plus. It is a pretty simple and straightforward tool. I'm not sure what you might find perplexing about Finder.
Finder has become fine, but when I first switched to Mac, it was hard to believe Finder was so bad compared to XP-era Windows Explorer.
> keeping screen on
`caffeinate -d` in the terminal - it’s built-in
What's crazy is that Vivid app...costs money!
Looks like there's an OSS app that does basically the same thing: https://github.com/starkdmi/BrightXDR
Welcome to the Mac ecosystem. Where basic functionality is gated behind apps that Apple fans will tell you "are lifesavers and totally needed in Windows/Linux/etc)" for $4.99-14.99/piece. And, when they get popular enough, Apple will implement that basic functionality in its OS and silently extinguish those apps.
And that's when they let you modify/use your OS the way you want.
There’s multiple free versions and forcing HDR on isn’t a basic feature by any means.
And yet, it's a simple toggle (sometimes multiple, for specific display flows) in GNOME, KDE, and Windows 10+.
As far as I understand, Windows only has a toggle for HDR on vs off; that's not what we're talking about here. This is about forcing the full brightness of HDR always, even outside videos. It's something that manufacturers don't allow, as it reduces display life; it would actually be an anti-feature for a consumer OS to expose as a setting. It'd be like exposing a setting to let your CPU go well beyond normal heat limits.
I don't mind that. 3rd party Mac utilities are nice: well designed, explained and do what they're supposed to because someone makes a living of it. I'm happy to pay these prices.
I would personally be afraid of using that in case it causes damage long-term to the screen either due to temperature or power draw or something. Idk if there are significant hardware differences but in this case I would guess there’s a real hardware reason for it?
I've used vivid nearly every day since the week the first m1 MacBook Pro came out, no damage to my screen at all.
People have to pay money to change screen brightness on a Mac?!
I imagine what those custom brightness apps do is not magically increase the brightness, but change the various pixels' brightness in accordance to some method/algorithm such that you see what appears to be brighter whites when placed next to certain other colors.
It's not what is implied by the parent post - where the mac is limiting the brightness only to have the app unlock it.
No, I believe the issue is Apple limits the top half or so of the brightness/backlight level for HDR content only. The apps allow it to be used for normal non-HDR content.
I think it's just a matter of some "I need HDR" syscall.
...I'd have to say that seems like a radical reading of the text.
No; you can adjust screen brightness just fine with the built-in settings, including with the F1 and F2 keys (plus the Fn key if you've got them set that way).
This Vivid app is specifically for extra HDR levels of brightness. I've never had a problem with my M1 or M4 MBPs, either inside or outside, with the built-in brightness levels. (But, to be fair, I don't use it outside a lot.)
It's classic Apple to spend over a decade insisting that that glossy screens were the best option, and then to eventually roll out a matte screen as a "premium" feature with a bunch of marketing around it.
Historically, traditional matte screen finishes exhibited poor optical qualities by scattering ambient light, which tended to wash out colors. This scattering process also affected the light from individual pixels, causing it to refract into neighboring pixels.
This reduced overall image quality and caused pixel-fine details, such as small text, to appear smeary on high-density LCDs. In contrast, well-designed glossy displays provide a superior visual experience by minimizing internal refraction and reflecting ambient light at high angles, which reduces display pollution. Consequently, glossy screens often appear much brighter, blacks appear blacker without being washed out, colors show a higher dynamic range, and small details remain crisper. High-quality glass glossy displays are often easy to use even in full daylight, and reflections are manageable because they are full optical reflections with correct depth, allowing the user to focus on the screen content.
Apple's "nano texture" matte screens were engineered to solve the specific optical problems of traditional matte finishes, the washed-out colors and smeary details. But they cost more to make. The glossy option is still available, and still good.
I used to have a 2006 MacBook Pro with the matte screen. It was glorious. None of these issues were present or really noticeable. Maybe you'd notice them in a lab setting, but not IRL. Kind of like 120 Hz and 4K: just useless to most people's eyes at the distances people actually use these devices. I've only owned matte external monitors as well, and again, no issues there.
The glossy-era MacBooks, on the other hand, have been a disaster in comparison, IMO. Unless your room is pitch black it is so easy to get external reflections. Using it outside sucks; you often see yourself more clearly than the actual contents of the screen. A little piece of dust on the screen you flick off becomes a fingerprint smear. The way you open the lid on the new thin-bezel models means the top edge is never free of fingerprints. I'm inside right now and this M3 Pro is on the max brightness setting just to make it, you know, usable, inside. I'm not sure if my screen is actually defectively dim or if this is just how it is. Outside it is just barely bright enough to make out the screen. Really not much better than my old 2012 non-Retina model in terms of outdoor viewing, which is a bit of a disappointment because the marketing material led me to believe these new MacBooks are extremely bright. I guess for HDR content maybe that is true, but not for 99% of use cases.
>I used to have a 2006 macbook pro with the matte screen. It was glorious. None of these issues were present or really noticeable.
They were absolutely noticeable. Contrast was crap. I immediately went with glossy with my next MBP around that same period.
I can't go back to the low contrast and washed-out look of matte screens unfortunately. The nano texture isn't terrible but I'd only use it if I had to work with a bright window or other lighting source behind me. If you go to an Apple store you can A/B test glossy vs. nano-texture and glossy wins for me.
OLED glossy on the iPad Pro is even better.
It became more of an issue when retina came out, that's when they stopped non-reflective screen options.
Yeah, what on earth. Go back to one of these old displays, I guarantee you want to gouge your eyes out at how terrible they are. 2006 should put you firmly in 720p land.
120Hz is absolutely a noticeable improvement over 60Hz. I have a 60Hz iPhone and a 120Hz iPhone and the 60Hz one is just annoying to use. Everything feels so choppy.
I believe refresh rate/FPS is one of those things where it doesn't really matter but human eyes get spoiled by the higher standard, making it hard to go back. I never saw issues with 30 FPS until going to 60, etc. Hopefully I never get a glimpse of 120 or 144Hz, which would require me to throw out all existing devices.
I'm not convinced. I have an iphone 14 pro which has a 120 Hz screen. I can absolutely see the difference when scrolling compared to my older iphone 11 or computer screens.
However, I'm typing this on my Dell monitor which only does 60 Hz. It honestly doesn't bother me at all. Sure, when I scroll long pages I see the difference: the text isn't legible. But, in practice, I never read moving text.
However, one thing on which I can't go back is resolution. A 32" 4k screen is the minimum for me. I was thinking about getting a wider screen, but they usually have less vertical resolution than my current one. A 14" MBP is much more comfortable when looking at text all day then my 14" HP with FHD screen. And it's not just because the colors and contrast are better, it's because the text is sharper.
Best take in this thread.
The jump forward doesn't even necessarily feel that huge but the step backward is (annoyingly) noticeable.
I can't tell at all when my mbp is in 120hz or 60hz. I tried to set up a good test too by scrolling really fast while plugging and unplugging the power adapter (which kicks it into high power 120hz or low power 60hz).
One of those things that some people notice, some people don't. I'm definitely in the camp where I feel differences between 120hz and 60hz, but I don't feel 60hz as choppy, and beyond 120hz I can't notice any difference, but others seemingly can. Maybe it's our biology?
I would bet most people would fail a blind test.
Basically everyone who has played videogames on pc will notice the difference. I easily notice a drop from 360Hz to 240Hz.
I also use 60Hz screens just fine, saying that getting used to 120Hz ruins slower displays is being dramatic. You can readjust to 60Hz again within 5 minutes. But I can still instantly tell which is higher refresh rate, at least up to 360Hz.
Video games also process input every loop, so there's a big difference there. It would have to be evaluated with video only.
We're talking about monitors here, which usually have a mouse cursor on it for input. Of course it would be hard to tell between 60 vs 120Hz screens if you used both to play a 30FPS video.
Lots of games don't do input on every loop. Starcraft 2 has 24 hz input.
60 to 120? Generally there are tell tale signs. If I quickly drag a window around it’s clear as day at 120.
Most people who’ve used both 60 and 120 could tell, definitely if a game is running. Unless you’re asking me to distinguish between like 110 and 120, but that’s like asking someone to distinguish between roughly 30 and 32.
North of 120 it gets trickier to notice no matter what IMO.
I can live with 60 but 85+ is where I’m happy.
It's super easy: put your finger on a touchpad and move it fast in a circle so that the cursor also moves in a circle. As the eye is not that fast, you will see multiple faint mouse cursor images. At 120 Hz there will be twice as many cursors as at 60 Hz.
On a perfect display you should see just a faint grey circle.
Another test is moving the cursor fast across a white page and tracking it with your eyes. On a perfect display it should be perfectly crisp; on my display it blurs and moves in steps.
So basically on a perfect display you can track fast moving things, and when not tracking, they are blurred. On a bad display, things blur when tracking them, and you see several instances otherwise. For example, if you scroll a page with a black box up-down, on a bad display you would see several faint boxes overlayed, and on a perfect display one box with blurred edges.
You could replicate a "perfect display" by analytically implementing motion blurring (which is really just a kind of temporal anti-aliasing) in software. This wouldn't let you track moving objects across the screen without blur, but that's a very niche scenario anyway. Where 120 Hz really helps you is in slashing total latency from user input to the screen. A 60 Hz screen adds a maximum of about 16.7 ms of latency, which is more than enough to be perceived by the user.
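For reference, the worst-case frame-wait numbers behind that figure (assuming the input arrives just after a refresh and everything else in the pipeline is equal):

$$ \Delta t_{60} = \tfrac{1}{60}\ \text{s} \approx 16.7\ \text{ms}, \qquad \Delta t_{120} = \tfrac{1}{120}\ \text{s} \approx 8.3\ \text{ms} $$

The difference of roughly 8 ms is where the "8 ms" quoted elsewhere in this thread comes from.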
I think it's more noticeable if you are touch-interacting with your screen during a drag. If you are scrolling with a mouse, you might not notice it at all, unlike when scrolling with your finger.
4K too, at anything over 15” or so.
I’m always baffled people insist otherwise.
At the distance I look at my TV screen (about 7 feet from the couch) I can't make out the pixels of the 1080p screen. 4k is lost on me. 20/20 vision, but I guess that is not enough.
Resolution is much less important for video than it is for text and user interfaces.
This is exactly why I went to 4K.
Used to have a 27" 2560x1440 monitor at home. Got a 4K 27" at work, and when I got home, the difference was big enough that I (eventually) decided to upgrade the home monitor.
Unless the screen is right in front of your face, video codecs and their parameters matter more than FHD vs UHD, IMO.
At least to me, with corrected vision, a high quality 1080p video looks better than streaming quality 4k at the same distance.
Compare apples to apples, e.g. gaming, and the difference is glaring.
I’m 3m from my TV and I can absolutely tell 4K from 1080p, but it is indeed subtle.
But a fraction of that distance to my monitor makes even 4K barely good enough. I’d need a much smaller 4K monitor to not notice pixels.
I also have perfect vision in terms of focal length, but it turns out I have astigmatism on opposite axes in both eyes.
Glasses make a huge difference when watching TV, and are the dividing line between being able to tell the difference between 4K and 1080p and not being able to discern any.
I agree with this, but I use a 43" 4K TV as my monitor... which probably isn't what you meant.
I notice it on my 27” monitor. I’ve seen 15” 4K displays and that’s about the limit where I can see the difference.
My eyesight isn’t perfect, either.
I have the last gen 27” 5k iMac with nano texture as my primary monitor these days and you can immediately tell the difference between image quality, compared to a glossy MacBook pro. Don’t get me wrong, it’s by far the best quality matte finish I’ve ever seen and I would buy it again, because it works great in a room with south-facing windows, but it definitely affects the overall image quality noticeably.
I still have my 2011 17" MacBook Pro, built to order with pretty much every option available at the time, including the matte screen.
While it serves a useful purpose by diffusing unavoidable point light sources in uncontrolled environments, it's honestly not much of an improvement over its glossy contemporaries in sunlight and other brightly-lit environments, as diffusing already diffuse reflections has little effect.
I have a 2013 MBP retina with glossy screen and a 2020 HP with a matte screen.
What I've found is that inside, the HP is much better at handling reflections. However, outside, the screen gets washed out and is next to unusable. Whereas on the MBP, I can usually find an angle where reflections don't bother me and I can spend hours using it.
Your 2006 MacBook predates Retina (i.e. high-resolution) displays, though. Any kind of smearing effect probably improved the perception of the image because it masked the very visible pixels in the LCD.
Mine was also matte in '06, and I had that machine for 9 years (until it was stolen :/). The only option for my replacement was glossy; I was devastated. A few machines later now, I can't imagine going back.
The 2006 would probably have had 1080ish resolution. I think the GP's point is that at higher resolutions, matte has tended to have the issues they cited.
I am with you in preferring matte. For me, mostly because of reflections on glossy screens.
Even at ~100 dpi, the grainy character of matte coatings from that era was noticeable; my 2006 iMac and a Dell Ultrasharp from a few years later were both unmistakably grainy in a way that glossy displays are not. At the time, the matte coatings were an acceptable tradeoff and the best overall choice for many users and usage scenarios. But I can imagine they would have been quite problematic when we jumped to 200+ dpi.
We have different eyes and different purposes, I think.
That's what Lunar is for. Just bump up the brightness to HDR levels. Helps a lot with the glare, but will take a bite out of the battery life.
For professional graphic designers, cinematographers, photographers, and illustrators, these subtleties in the screen are a big deal.
To each their own but I have a matte M4 Pro and I don't like it, and the screen is noticeably worse than my glossy M2 Pro.
There's a graininess to the screen that makes it feel a little worse at all times, meanwhile I never had a problem in daylight just cranking brightness into the XDR range using Lunar.
It's especially noticeable on light UIs, where empty space gets an RGB "sparkle" to it. I noticed the same thing when picking out my XDR years ago, so it seems like they never figured out how to solve it.
The difference between matte and glossy displays in regards to their contrast and clarity is absolutely noticeable to the naked eye.
There is a large visual difference between 60hz/120-144hz.
> Unless your room is pitch black it is so easy to get external reflections
This is nearly my preferred setup, only I have wall lights on the wall behind the monitors so it's not a truly dark room (which would be horrible for your eyes). No overhead lights allowed on while I'm at the keyboard.
Just make sure to not wear glasses or white clothes.
Good for you! Not as good for a typical office though.
Well, I WFH, so of course. Yet another reason RTO is a no go
Both 4k and 120hz were very noticeable improvements imo.
> High-quality glass glossy displays are often easy to use even in full daylight…
Not my experience in lit environments. Looking at a mirror-like surface trying to distinguish content from reflections is exhausting.
Unless I blast my eyes at full brightness which is more exhausting.
To each their own. Matte screens always have a massive smudge in bright light and look terrible and grainy in the dark. I can’t stand them.
If all that is true, why do professional photography monitors pretty much exclusively have matte finishes? Same for monitors used by video, CAD, or 3D professionals.
You guys need to stop reading Apple advertisement material and taking it as gospel just because it has some fancy scientific words in it.
Matte has always been the fancier option in photography paper; glossy photographs just look cheap.
Interesting, given that in the older days of analog dark room development, you had to use a special kind of paper and heat-press it against a polished surface when drying to get a glossy photo.
I always thought matte photos were more readable, but glossy used to be more wow and have “deeper blacks”.
> High-quality glass glossy displays are often easy to use even in full daylight,
I guess Apple cheaped out on their glossy displays, because I definitely didn't care for mine in full daylight
Glossy vs matte has started to matter less as the peak brightness goes up.
When your screen can do 1,600 nits, daylight isn't as much of a problem
I'd rather not blow my battery budget on fighting the sun for visibility.
I tend to do outdoor things outdoors, so occasionally cranking up brightness is not an issue.
I'd much rather do that than have a grainier screen with worse viewing angles all the time I'm not in direct sunlight, so next time around I'll be back on glossy.
Yeah this m3 pro isn't really doing 1600 nits. Marginally brighter than my 2012.
To get to actual 1600 nits you need to use scripts.
https://github.com/SerjoschDuering/macbook_1600nits
Not sure what the impact on display health or battery is from running the screen full bore like this.
I use Lunar and have used it on my Pro Display XDR and every MBP with XDR I've owned with 0 issues.
All of what you say is kind of sort of true in the sense that, if you are in a room with lots of off-axis light hitting your screen and darkness behind you and you yourself are not brightly lit, then the glossy screen is better. And the glossy screen is certainly sharper.
But if there’s a window or something bright behind you, the specular reflection from the glossy and generally not anti reflective coated screen can be so bright and so full of high frequency details that it almost completely obscures the image.
And since I might be trying to work involving text in a cafe as opposed to doing detailed artistic work in a studio, I would much prefer the matte surface.
Do you prefer glossy paper work? glossy book pages? glossy construction documents? The preference for a non-reflective surface for the relaying of dense information has been established for decades.
You know what's glossy? Movie posters and postcards.
Paper, books, and construction documents all use reflected and not refracted light.
ooh, my feathers were a bit ruffled (for reasons unrelated) when I wrote the above.
I still say for comfortable all-day viewing and productivity, there is no comparison. Glossy does have more pop on a phone or when watching movies in the dark, but I'd go blind doing that all day every day.
The non-reflective surfaces you cite have pigments on TOP. Screens have depth, causing parallax and light spreading. Your point would be valid if screens were paper-thin and the image pixels came out at the very surface.
You'd need a jeweler's loupe to appreciate parallax and spreading. Not a real problem in general use.
i use a matte screen protector on my iphone. without it, i can see pixels. with it, i cannot. no loupe, just my nearsighted eyes
You can see actual pixels on a retina iphone? That is remarkable eyesight. I could do it on old non retina iphones but not on retina models.
Kind of a cool thing about being nearsighted. Without glasses, I can get very close to things and still focus on them, so I get to see very small details.
Is there any write up on the tech behind nano texture? What makes them better than traditional matte screens?
You make it sound like what they, according to you, tried to do was a success. One look at a nano texture screen is enough for a resounding no.
Somebody drank their portion of Kool-Aid for sure. There is that little detail that glossy screens needed absolutely perfect conditions in front of them not to reflect literally the whole world, often making the resulting visuals subpar to matte. I have never, ever been in work conditions in the past 20 years that didn't manifest this in an annoying and distracting way.
I haven't seen a single display that ever overcame that properly for long-term work. Sure, phones use it, but they increased luminosity to absurd levels to be readable, which is not a solution I prefer for daily long work.
I admit there are corner cases of pro graphics where it made sense (with corresponding changes to environment) but I am not discussing this here.
I have a feeling that you've never actually seen a matte screen.
These AI comments suck. I mean sure. It’s probably true. But the pollution of our social interactions with slop is so icky.
I receive these highly polished emails from people and am just annoyed. Do you expect me to answer your robot?!
Maybe there needs to be a bad style minimum for a forum in the future. Only human imperfections allowed.
Ok. Off topic maybe.
I love the Nano texture displays. And the glossy glass ones were also great and the best ones out there.
Hi! I don't think I have any way of convincing you, but I'm not an AI. Also, randomly accusing people of being an AI is fairly offensive, in case that's not obvious.
It is well written and that makes you think it was written by AI? AI doesn't write as well as that anyway.
Sounds like Apple marketing wankery. I have a matte high density LCD from 2013 (Lenovo) that looks great. Does Apple even make the displays? What exactly are they "engineering" here?
> What exactly are they "engineering" here?
The coatings, which do matter quite a bit when you are optimising for some durability/optical quality tradeoff.
Glass covers make screens more durable, but imply internal and external reflections. Laminated screens on glass panes solve the internal reflections and improve transmission, but do not help with glare and external reflections. Those can be improved by texturing the glass, but at the cost of diffraction and smearing, leading to a decrease in effective resolution. Unless the texture becomes small enough, but then you need it to be durable enough to avoid being wiped off or damaged by things that might come into contact with the screen.
It turns out that there is a lot more than the bottom layers that matter in a display. You can see all these problems being solved in succession when looking at the evolution of Apple’s displays over the years (and others’, but it is much easier to find information about the good and bad sides of any Apple product). It’s fascinating, actually.
[edit] Add the issue of oils on human skin and you have to deal with oleophobic coatings for touch screens, which is another very important factor to consider. In addition to how the touch sensors are integrated.
If anything, Apple was right back then. Glossy has so many benefits for the places where you’d use a computer, it’s not even close. Having the option to pay premium for those few that work in environments where matte helps them makes sense. I’d pay money for my display to not be matte.
Apple was actually late to the glossy display party. HP and Dell moved to them a few years before Apple. I don't think Apple was "insisting" on them, but rather following an industry trend that they were late to.
I wonder if they will (re)introduce premium keyboards with sculpted keys that self-center your fingers someday. MagSafe coming back was nice; maybe more extra ports?
MagSafe + SD card reader + headphone jack + USB-C/TB4 only ports is fine by me. In 2025, I'm well past needing USB-C to USB-A dongles. We've had since what 2015/16 to start the conversion to C only.
My car from 2023 still came with a USB-A port. A not-so-cheap USB camera that I recently bought came with a USB-A port.
The camera came with a USB-A port, or simply provided a cable that had a USB-A end? I've never seen a camera with an A port
It was a cable with USB-A end. The cable cannot be detached from the camera.
As someone who buys and likes Apple stuff, I agree, it's a signature move from them.
They are really good at taking a small quantitative improvement that causes them to start using something and selling it as a new type of thing that went from impossible to possible. As if the tech just didn't exist before Apple started using it.
It is probably a pretty good screen, though.
I don’t really like Apple overall. But, to some extent, it’s like… well, maybe that’s a good way of selling incremental engineering improvements.
i recently worked with a macbook pro and it caused uncomfortable feelings of eyestrain. i had some app that was supposed to disable the temporal dithering but i'm not sure if it helped. i'm curious if there's anyone else on here like me who has experienced eyestrain with macbooks where the nano texture display has helped.
> It's classic Apple to spend over a decade insisting that that glossy screens were the best option
I don't recall Apple ever "insisting" anything about glossy vs. matte. They simply eliminated the matte option without comment, and finally brought it back many years later.
If you have a reference to a public statement from Apple defending the elimination of the matte option, I'd like to see it.
To be clear, I've been complaining about glossy Macs ever since matte was eliminated, and I too purchased an M4 MacBook Pro soon after it was available.
> “…featuring the Intel Core Duo processor and a gorgeous new 13-inch glossy widescreen display…”
> “…the MacBook provides incredibly crisp images with richer colors, deeper blacks and significantly greater contrast…”
This is positioning glossy as being superior.
https://www.apple.com/newsroom/2006/05/16Apple-Unveils-New-M...
It's indisputable that glossy displays have advantages over matte displays. It's also indisputable that matte displays have advantages over glossy displays, most importantly, fewer reflections of ambient light. The choice is a tradeoff.
A sentence in a PR that highlights an indisputable advantage of a glossy display does not position glossy as being superior overall but merely superior in the respects mentioned, which is not controversial.
Moreover, Apple continued to offer a matte display in the MacBook Pro for years after that PR, so why would they sell an "inferior" option?
In one quote they used glossy to describe it. How does that mean they said that glossiness made it better?
The other quote is just a list of ways in which the screen is better.
It is YOU that is conflating these and saying that this list of improvements is down to glossiness, not Apple.
The "matte" options also are totally different approaches, different quality levels. They're not the same product.
> They simply eliminated the matte option without comment, and finally brought it back many years later.
Wasn't the matte option that disappeared back then just the removal of the glass in front of the screen? I seem to remember that (my MBP from that time was glossy).
The nano textured coating they are using now is quite complex and I am not quite sure it was applicable at such scales cheaply enough back in 2015.
The PowerBook and the first MacBook Pro were only matte.
A glossy option was introduced in 2006, but the MacBook Pro was still matte by default.
In 2008, the MacBook Pro case was redesigned, and then the display situation changed significantly.
I don't think this is exactly accurate. The matte was a ~$80 upgrade option after they released the glossy. I definitely preferred the matte screens and still do. For coding, reducing glare in uncontrolled environments is way more important to me than color fidelity, but to each their own.
It's certainly on brand for Apple to face widespread criticism in the past for having matte screens as the default (computer magazines of the day found that matte finishes made screens too dim) only to face renewed criticism for dropping the thing they were previously criticized for.
It's classic Apple commenter behavior to not know about Apple. They offered matte display upgrades on the MacBook Pro almost 20 years ago. The current glossy black display only became a product-line-wide choice with the Retina displays in 2012, likely because they didn't prioritize getting an appropriate matte glass finish on the Retina screens due to low demand.
I can make the same argument about you. Matte display was the standard prior to Unibody MacBook Pros in 2008.
Glossy was an available option, but not the product line wide choice.
The top of the line Late 2008 MacBook Pro (not Unibody) included:
> An antiglare CCFL-backlit 17" widescreen 1680x1050 active-matrix display (a glossy display was offered via build-to-order at no extra cost, and a higher resolution LED-backlit 1920x1200 display also was offered for an extra US$100).
https://everymac.com/systems/apple/macbook_pro/specs/macbook...
Are you an Apple commenter?
Downvoted for the unhelpful first sentence.
A frequently overlooked point is the display brightness. The Pro models offer 1600 nits peak brightness, which makes these good units for looking at HDR content, especially if you like to take photos or edit videos. Meanwhile the Air maxes out at 500 nits, so the effect and contrast are drastically reduced on those models.
Not just that, but you can use BrightIntosh to force it on.
I live in a sunny place with big windows and basically use it all day every day. When it turns off, my screen feels broken; that's how much I prefer the brightness.
Hi, thank you so much! I live in the tropics and often work outdoors, and THIS is a lifesaver!!
Thanks again!
https://www.brightintosh.de/
Normal content is still limited to 500 nits, and these being mini-LED displays, contrast is already infinite.
Unless you’re making Instagram content, very few photographers use HDR. Everything else will look the same on both screens.
Normal content is 1000 nits, peak is 1600 nits.
Contrast is significantly poorer on the Air display, and HDR is already in your own photos if you have a modern smartphone, so the idea that it’s niche or irrelevant is a naive take.
The perceptual difference between SDR and HDR isn't a minor bump; it is conspicuous and a driver of realism.
If one cares about the refresh rate of their screen, then they’d trivially notice the improvement that high nit displays provide.
> and these being mini-LED displays, contrast is already infinite.
I think you may have mixed up mini-LED backlighting with OLED and microLED displays. mini-LED backlights merely allow for better local dimming of the backlight behind an LCD, but the number of independently variable backlight zones is still orders of magnitude smaller than the number of pixels. Over short distances, an LCD with local dimming is still susceptible to all of the contrast-limiting downsides of an LCD with a uniform static backlight (and local dimming brings new challenges of its own).
OLED is the mainstream display technology where individual pixels directly emit their own light, so you can truly have a completely black pixel next to a lit pixel. But there are still layers and coatings between the OLED and the user, so infinite contrast isn't actually achievable.
microLED is a so-far-unsuccessful attempt to provide the benefits of OLED without as many of the downsides (primarily, the uneven aging). Nobody has managed to make large microLED displays economically yet, and it doesn't look like the tech will be going mainstream anytime soon.
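To put rough numbers on "orders of magnitude": the 14" MBP panel is 3024x1964 pixels, and the commonly reported figure for its backlight is on the order of 2,500 dimming zones (treat that zone count as approximate):

$$ \frac{3024 \times 1964\ \text{pixels}}{\approx 2500\ \text{zones}} \approx 2400\ \text{pixels per zone} $$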
> but the number of independently variable backlight zones is still orders of magnitude smaller than the number of pixels
The appearance of a lone mouse cursor on a black screen in the dark is mildly amusing for exactly this reason. You can watch as the ghostly halo of light follows it around the screen as you move the cursor.
I'll upgrade my machine when they put an OLED display in it.
> The nano texture display is great at reducing reflections. I could immediately see the difference when placing two laptops side by side: The bright Apple Store lights showed up very prominently on the normal display, and were almost not visible at all on the nano texture display.
This is a quiet boon for those who enjoy working outdoors but find the sun/brightness a problem.
20 years ago I bought a G3 iBook because the hardware was lovely and the system was supported perfectly by stock Debian woody. (Hands up if you remember having to bless your laptop with “holy penguin pee”, part of the output of the yaboot bootloader used in PowerPC systems!)
Times changed and the best hardware for me right now is a Dell XPS from the model lines a few years back that looked like an aluminum sandwich with a black plastic filling. These machines are fantastic but (1) no OLED, (2) no high refresh rate, and (3) the keyboard isn't great.
Could this modern Apple hardware bring me back to Free OS on pretty hardware, or is there something else I should try?
Asahi (Linux) lags quite far behind the latest Apple hardware release. If you want the Linux experience on Apple hardware, I think the best move is full-screen VM. Performance of that is more than good enough, but it does mean you are running a full non-free software stack to get to your free software VM.
I bought one of those iBooks for Debian linux, but I found the resolution was a bit small for X. At the time, I had a thing for non-intel architectures. Prior to that, I had done a lot of work packaging up Debian for Sparc machines. I had access to a wide variety of Sun workstations at my job as a sysadmin at a university.
Incredible hardware. Love that I can also run local llms on mine. https://github.com/Aider-AI/aider/issues/4526
But are these llms worth their salt?
They're not unless you curve the grading because they're running locally.
Which some people do, but I don't think the average person asking this question does (and I don't)
With 128GB of memory they can have real world use cases. But they won’t be as good as SoTA hosted models.
If you bought a fully-featured computer that supports compute shaders and didn't run local LLMs, you should be protesting in the street.
Can't you run small LLMs on like... a Macbook air M1? Some models are under 1B weights, they will be almost useless but I imagine you could run them on anything from the last 10 years.
But yeah, if you wanna run 600B+ weight models you're gonna need an insane setup to run them locally.
I run Qwen models on an MBA M4 16 GB and an MBP M2 Max 32 GB. The MBA is able to handle models in accordance with its (shared) VRAM capacity (with external cooling), e.g. Qwen3 Embedding 8B (not 1B!), but inference is 4x-6x slower than on the MBP. I suspect the weaker SoC.
Anyway, the Apple SoC in the M series is a huge advantage thanks to shared memory: VRAM size == RAM size, so if you buy an M chip with 128+ GB of memory you're pretty much able to run SOTA models locally, and the price is significantly lower than AI GPU cards.
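As a rough sketch of what that looks like in practice, here is how you might load a quantized model with llama-cpp-python, whose Metal backend allocates the weights in that same shared memory. The GGUF file name and the parameter values below are placeholders, not recommendations:

```python
# Minimal local-inference sketch on Apple Silicon using llama-cpp-python (Metal build).
# The model path is a hypothetical local file; point it at whatever GGUF you actually have.
from llama_cpp import Llama

llm = Llama(
    model_path="models/qwen3-8b-q4_k_m.gguf",  # placeholder path
    n_gpu_layers=-1,  # offload all layers; unified memory means no separate VRAM copy
    n_ctx=8192,       # larger contexts eat more of the same shared RAM
)

out = llm("Explain unified memory in one paragraph.", max_tokens=200)
print(out["choices"][0]["text"])
```

The practical ceiling is simply whether the weights plus the KV cache fit in the machine's RAM, which is exactly why the 128 GB configurations are interesting.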
They "run" in the most technical sense, yes. But they're unusably slow.
It's always interesting to see users have somewhat strong opinions over fan vs fanless. I could never go MacBook Air again because I've been to hotter climates and do things beyond just using a browser, and invariably the keyboard gets too warm for my fingertips. I need the MBP's fans and Mac Fan Control, noise be damned.
How much of a difference would I see in compute between an M2 and M4 for example? Assuming it’s the same RAM. Did they also make the gpu and neural engine that much better between the two?
> My ideal MacBook would probably be a MacBook Air, but with the nano-texture display! :)
The MBA should also have the 120 Hz LCD and the brightness from the MBP, vapour chamber cooling from the iPhone Air, and a better keyboard.
Heh, matte; finally. Gloss is such a PITA if you can't control what's behind you, which ironically is a pretty common dev-with-macbook experience. Walking around to different parts of the office. Off-sites. Etc.
I've only purchased matte screen laptops because I only use them for travel. Lenovo pretty much.
Also prefer semi-gloss for my monitors as I work in well lit daylight conditions if I can help it. There have been very high quality semi-gloss monitors for ages now.
I do not like the Apple Nano Texture. 5% of the time it really helps but 100% of the time it just reduces the picture fidelity somehow. When doing visual tasks like video editing, it is just not good.
Is it possible to install a previous macOS version on the newest MacBook model? I see people having terrible experiences with macOS Tahoe, yet I am considering purchasing a MacBook.
No, this is not possible, because Apple stops signing older versions soon after they release the latest.
No, that's a separate issue. You can upgrade a M4 or earlier machine from 15.6 to 15.7 even today, despite 26.0 being out for a while, so Apple's still signing a 15.x release at the same time as they're offering 26.x releases. (You likely won't be able to downgrade from 15.7 back to 15.6.)
Downgrading a M5 machine to 15.x would be impossible not because of a signing issue but because Apple never released a 15.x build that supported M5 hardware.
> I don’t use this computer for serious work.
Next.
Why is this notion that basically only opinions on stuff that you've used in a work capacity are valid so widespread here?
I also went for the fantastic nano texture display on my M4, after having glossy on my M1. Very happy with the decision, as I use the laptop in brightly lit environments, so I appreciate fewer reflections. Going back to a glossy display is a shock.
I was on the fence for the same reason: should I get the nano display? I opted for the 15" MBA, and the display has been great. Way better than my 2019 MacBook Pro. I've had zero issues with glare, but I'm also in an office environment during the day and use it at night when home.
You won't notice 8ms difference in input lag
I’ve been swapping back and forth between a MacBook Pro and a Linux workstation lately. The input latency difference is insane - macOS is sooo much worse than Linux. It’s gotten to the point that I’m porting code to Linux just so I don’t have to use my editor from macOS.
I don’t know how many milliseconds the difference is, but going back and forth it’s so obvious to me that it’s painful.
Lots of people can notice that. My last job involved meticulously timing our software's input-to-display latency, testing viewers' responses to it, and fighting for each and every ms we should shave off of it.
For my sins, I have recently been called upon to cold boot and then provision a few dozen Samsung tablets by hand. The "laggy Lagdroid piece of lagshit" pasta has been repeated a lot. I swear to God it just ignores ten percent of touch events if it's doing anything in the background.
As a seasoned gamer, and one time world record holder, I absolutely can notice 8ms of lag.
Anyone can notice an entire frame of input lag.
The question is more whether it’ll bother you.
I have 165Hz monitors. Software feels noticeably more snappy.
Couldn’t be more wrong.
agree
Musicians can feel latencies as low as 1ms.
Apple is designing pro gear for its target audience.
Fun fact: 1 ms is approximately the amount of time it takes for sound to travel 1 foot. Do musicians move all their speakers to be within one foot of their ears? Do people in a band notice a difference if they're not standing within 1 foot of their partners? No, they don't.
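For the record, the arithmetic behind the "1 foot per millisecond" rule of thumb, assuming the usual ~343 m/s speed of sound at room temperature:

$$ d = v\,t = 343\ \text{m/s} \times 0.001\ \text{s} \approx 0.34\ \text{m} \approx 1.1\ \text{ft} $$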
Do you have a source for that? I saw a study a short while ago showing the “just noticeable difference” for audio latency was best case around 26ms.
https://dl.acm.org/doi/fullHtml/10.1145/3678299.3678331
I definitely notice the difference between 10 ms and 26 ms. 26 ms already feel sluggish when playing drums, guitars or keyboard instruments. But there is no way anyone can feel a difference of 1 ms.
That’s audio latency, not musicians doing music. In my experience if you have two musicians that are supposed to be playing unison, 5-6 ms is enough to feel “off”
It depends on the frequency. At higher frequencies, the ear is capable of higher time precision. It's why a snare pops and a bass drum blooms.
The study wasn’t conducted with musicians making music.
I highly doubt anyone notices 1ms latency. I might believe rare people can notice 10ms.
Anecdotally, 7 ms vs 3 ms latency is felt as a weirdly heavy action when playing a MIDI keyboard. It's not felt as latency, but it's felt. And I bet the difference could be reliably established in double-blind testing (3 samples, find the outlier).
1 ms seems less believable, but I wouldn't be surprised if some people could notice that too.
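A quick sketch of how that double-blind claim could actually be scored: in an odd-one-out test the guesser is right 1/3 of the time by luck, so you just ask how improbable a given run of correct picks would be under pure guessing (the 9-of-10 example below is arbitrary):

```python
# Odd-one-out test: a pure guesser picks the outlier with probability 1/3 per trial.
# How unlikely is getting k or more correct out of n trials by luck alone?
from math import comb

def p_at_least(k: int, n: int, p: float = 1 / 3) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

print(f"{p_at_least(9, 10):.4f}")  # ~0.0004, so chance is a poor explanation
```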
Again I have to point to this Microsoft Research Video.
https://www.youtube.com/watch?v=vOvQCPLkPt4
Fantastic video. QED.
"My ideal MacBook would probably be a MacBook Air, but with the nano-texture display! :)"
Mine as well. What is the likelihood this will happen?
I have a hunch it will not, and they will either scrap the nano texture completely or keep it as a differentiator for the Pro line, but I am curious what others think.
Mine too, and I bought an Air in the last generation that I barely use, because I thought the 60 Hz display would be OK, but I've been living with 120 Hz everywhere for long enough that 60 Hz is actually horrible to use now. First world problems for sure, but it's enough that I literally don't use the machine.
I’ve used MBP for many many years, but recently bought an MB Air. I slightly miss the extra ports. I love how much lighter it is. I never notice a speed difference. I’m always ssh’d into a Linux box if crunching any real data, and for UI stuff the CPU doesn’t need a fan at all. Definitely gonna stick with MB Air.
We used to sell conversion kits to shoehorn a pixel qi display into the thinkpad x230. Since apple has put in 1,000nit displays on the pros, we don't bother anymore. The nano texture sold me and it performs wonderfully outdoors. I hate giving apple money but here I am.
Honestly I hate giving money to Lenovo, they're one of the worst companies I've had to deal with at least when it comes to support.
+1 to that. Simply horrendous post-purchase support. Company representatives on all levels, from a simple technician to the head of the Linux support department, will lie straight to your face just to scam a few thousand bucks out of you.
But their keyboards are still the best, and trackpoint is unmatched. As soon as System76 or Framework or any other vendor offer that, I'm giving them my money.
It's because Apple sucks the least. They still suck, though. They could build decent computers that are upgradeable, but they refuse because they want your $$$$ in large amounts.
I may have to check out the new nano display. The old matte display was really a superior choice to the glossy screens of the past several years.
> (When I chose the new laptop, Apple’s M4 chips were current. By now, they have released the first devices with M5 chips.)
Does anyone have any feedback on the new M5 models?
I upgraded from M4 to M5 MBP because I broke my M4's screen and so my company ordered a replacement M5 while the M4 is being repaired. I can't really notice a difference at all. It's an absolute work horse, but so was the M4. I _did_ spring for the nano texture display this time around, and that is definitely nice (but nothing to do with the M5)
Do you think you’ll have any regrets about the nano texture display?
I was torn between nano and regular glass, but opted for the regular glass.
I have the nano-texture display on my M4. At this point, I don't think I can go back to standard glass. For text work, I find there are no downsides. If you work more with color and detailed art, I think that's the only case where you need to put extra thought into it. Otherwise get it
No, I love it. I had non-matte glass screens in my MacBooks since 2012 and I didn't realize how much better it is to no longer see lights reflected in there all the time.
I got the nanotexture on my current work M4 MBP—it doesn't completely eliminate reflected light, but it diffuses it a lot. If I were in a dark room with a light source positioned perfectly to reflect off my screen in my face, I would probably still have trouble with it, but in general I don't need to reposition the screen to avoid glare nearly as much.
I would say it's worth the extra, what, $200 or so? on the price of the M4 MBP. If it were much more expensive, I would be less sure.
It’s often much more than $200 as the base models can be had for huge discounts, like $450 off retail, but the second you check the nano texture option, you lose the discount and you tack on the extra $200. So it’s often closer to $700 in some cases.
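Using the parent's example numbers, the effective premium is the forfeited discount plus the option price:

$$ \$450 + \$200 = \$650 $$

give or take, depending on how big a discount you would otherwise have gotten.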
I hate to say it, but it's totally worth it. It's incredible in direct sunlight.
I just upgraded from an M1 to an M5 a couple days ago.
It is rather shocking how much faster everything feels given I didn't think my old macbook pro was slow. While I expected xcode builds to be faster (and they are), I was a bit shocked when opening a new firefox tab was instantaneous since I hadn't noticed it wasn't before.
Another thing I didn't expect is that the new speakers have noticeably more bass and can get quite a bit louder.
I didn't get the nano-textured display, because having to adjust the display angle to get colors to render correctly is more annoying than having to do it for glare (I don't work in a high-glare environment).
How is Apple's nano-textured display different from ThinkPad's famed matte display?
funny i was recently picking between a glossy and nano texture screen and came to the opposite conclusion — the glossy screen’s image was so much more crisp, and i didn’t really see much difference in terms of reflection
I have the M4 Max. The fans never really come on unless I launch something that maxes out the GPUs, which I rarely do. I do have some software projects that use all CPUs and maxes those out while they build (all 14 of them). The fans stay silent.
This is, by far, the fastest machine I've ever had. My previous laptop was a more modest M1 MacBook Pro. And before that I was on a cheapo Intel i5 Samsung laptop - a stopgap solution after my last Intel Mac died when a loose keyboard key destroyed the screen (yep, the generation with the crappy keyboards, worst Mac I've ever owned). That Intel was of course pathetic and shit. I wasn't expecting much and it disappointed me despite that. The M1 was about 3x faster. The M4 Max is a beast. In terms of build speeds, the i5 was unusable while building and would take 15 minutes. The M1 got it down to 5 minutes (10 CPU cores that are faster than the 4 Intel ones). But it didn't have enough memory so swapping slowed it down a bit. The M4 Max builds stuff in around 30 seconds. No more swapping, and the 14 cores are quite a bit faster than the M1 ones. Same project (but of course with a few years of development). We have more tests now, not fewer.
Otherwise it's a great laptop. Keyboard is fine. The touchpad is best in class in the industry (everything else is pathetically mediocre in comparison; it's not even close), and the screen is best in class as well (contrast, colors, resolution, everything). And Apple learned its lesson when it comes to keyboards. Most Windows/Linux laptops I'm aware of are a compromise between heating/cooling, lousy input and output devices, performance, design, screen quality, etc. Apple nails all of those things. Nobody else does.
High end Macs are not cheap. But for professionals it's a minor expense. If you lease a car to get your ass to work every morning, you are probably spending at least 2-3x more than what this would cost you. And the whole point of getting to work is to open your laptop and earn a living with it. It's more important than the damn car. It's what pays for that car. I spend less than what used to be 1 hour of my freelance rate per month on this absolute monster. Maybe it's 2 hours for you if you just got started. That's still nothing on 160-ish billable hours per month. Employers tend to be less enlightened, of course. But if it's your choice, don't be frugal: buy the laptop you need. If a simple browser is all you need, of course get something decent looking like a MacBook Air or whatever. But otherwise, get the best you can afford. I compromised once with that Samsung. I did not enjoy that.
The part about noticing web pages loading (at most) 8ms faster due to the display is total nonsense. Many can notice the difference between 60 and 120Hz when scrolling, but definitely not for a page load. That’s less than 1/10th of the blink of an eye.
If page load seems noticeably faster, it’s far more likely that it’s simply a faster machine. Or imaginary.
I still have a 2019 MacBook Pro with the non-butterfly keyboard and escape key (unfortunately still the Touch Bar).
It’s still a great laptop except the battery lasts maybe 75 mins. I just keep it plugged in but despite the fact it’s 6 years old I don’t notice any problems with it.
I’m tempted to buy an M4 laptop just because it’s “new” and “faster” but then I ask myself Why? It’s the same thing with my iPhone. Until my laptop dies or there is something functional that I can’t do with my old laptop I’m going to keep using it.
I have an M1 Air that still lasts 7-8 hours on one charge. It's very different from the Intel battery life; I had 5 or 7 of those machines over the years.
depends on use, I had the same laptop but the speed increase when I upgraded to an M3 was easily worth it
I have done real work, using a computer 10+ hours a day, on every ecosystem: Windows, Linux, Mac. I've used each for ~10 years apiece.
My most recent laptop died and it really showed me what I appreciate in a laptop: performance, build quality, light weight, good battery, low noise, good ergonomics.
I was sick of the recent overheating generation of pc laptops that don't last more than a couple years under my usage.
As a result I decided to try to switch back to a macbook after a decade hiatus.
The hardware is good but the software is absolute garbage. Trialing it for a week the amount of bullshit that is MacOS was enough, and Asahi wasn't there yet either. Instead I decided to get an AMD framework laptop.
Best decision ever.
I have a laptop with great build quality that can be upgraded without paying a $5k tax, whose keyboard can be replaced for $100 instead of $700, and that works with me rather than against me and my wallet.
Which one did you buy? I’m also considering leaving mac just because of how slow and battery intensive the new macOS tahoe is.
https://frame.work/gr/en/products/laptop13-diy-amd-ai300/con... with the AI 7 350 because I was concerned on heat but given the choice to buy again I'd go with the HX370.
Same experience. I cannot consider any screen that does not have the nano texture coating. It is exceptional and a huge improvement. To the point that I actually prefer a tester Samsung Galaxy S25 Ultra over Apple’s own iPhone display.
I couldn't really trust the author of the review after he established his preference for "quiet computers" having no cooling slots or whatever he called them. Okay, fine, you're placing aesthetics above actual performance, then. The Pro laptops are the only ones viable for really hardcore work, because if you push the Air too hard it is going to throttle in order to stay cool, and that's not what you want if you are doing graphics work or, in my case, running a bunch of containers in K8s. I never bought an Air because it was too similar to an iPad.
The thing that mostly irks me about Apple these days is the soldered-in RAM and non-upgradeable storage. Apple is still the best thing going for most pro development work, but it's just so irritating that they shit on us like this. I did buy an M4 Mini and expanded it some. My 2019 MB Pro is sitting here on the desk, mostly unused these days. The Intel Macs are basically dead now--still great computers, but no longer desirable. My daughter is doing Graphic Arts in college and is using another 2019 Pro for that. I've used Macs continuously since at least 2014.
>The thing that mostly irks me about Apple these days is soldered in RAM and non-upgradeable storage.
Isn't the 'soldered-in' RAM and storage fundamental to the M-series architecture? It's not like there's a board with individual chips sitting in it for the RAM and storage, that could potentially have been 'popped out' if they weren't soldered in. It's all one giant 'chip' now.
There are separate chips.
But just like Strix Halo, they have to be soldered. There’s no way to reach the signal integrity required with connectors.
No, the M series is a system on chip (SoC); that's why it's able to run local LLM models in a range impossible for other laptop brands: VRAM == RAM, unified shared memory at max speed for both CPU and GPU.
Strix Halo has the same unified RAM with no separation.
Sadly it’s not in many laptops, probably the easiest way to obtain it is in the Framework Desktop or a mini pc.
I've heard many people saying CAMM2 solves this.
We're still waiting to see any CAMM-style memory module show up in a mass market product at any speed, instead of merely getting press coverage where the number of articles written seems to outnumber the number of laptops actually built and shipped. But even if you are willing to take the examples thus far seriously as real products, they haven't come close to matching the speed of soldered LPDDR.
I was considering an LPCAMM2-fitted Thinkpad. I was eyeing to buy one with less memory and then buy a 96GB module to upgrade it. However, the module was nowhere to be found in stock, and where it was found, it was priced almost like the whole laptop.
CAMM is still less effective than in-package RAM bundled with your CPU. The Framework folks looked into using CAMM for their recent AMD APU-based desktop and it was a no go.
After 18 years of Mac-abstinence, I just bought a MacBook Air and realized there is apparently no way to change the App Store language without changing region and payment method. WTF? That seems like the most basic thing one could imagine. What has happened to Apple?
I was able to switch the App Store language from English to Spanish by changing my primary language in System Settings > Language & Region > Preferred Languages.
It didn't require me to switch my region or payment method.
That seems like classic Apple, really.
Why did you think Apple was user friendly or flexible...it's the Apple way or the highway. Most only stick around because of the currently superior hardware
macOS does not have auto update. In fact it doesn't bother you with any updates, which led to me being behind on patches because I was accustomed to Windows nagging me for updates every week.
> In fact it doesn't bother you with any updates
Patently false on modern MacOS. I get a reminder about Tahoe every week or two. Plus a persistent red "1" dot in the Settings app that you can't dismiss. And a huge info/advert panel in the 'Software Update' section of Settings about Tahoe, that you can't dismiss.
It's usually just a persistent red dot on System Settings and a menu entry saying there is an update.
Pro tip: remove 'Settings' from the dock, create a shortcut to the 'Settings' app, and put that in the dock.
Now you just have an annoying tiny black arrow instead of a red dot.
mine seems to be doing just that pretty religiously.
how do you avoid the nagging ?
The nagging might be enabled by the IT support of the company you work in. Mine is also not nagging but the company one used to do it quite often
It nags by default. There is a plist setting to turn it off though.
Please do let the mere mortals and Linux refugees know! Thank you!
i'll never understand picky preferences about monitors... i still use an LG flatron wide that's old enough to vote... and when i slack at the apple store, it's not like i notice some life-or-death difference. a monitor is a monitor.
ok, i guess for graphic designers it might matter more?
Or people who read text.
Some old LCD displays were quite crisp. Sure, you can see individual pixels. The mouse tail has a clear zig-zag. But I find these nice on the eyes in their own way. I suspect because eyes autofocus more easily.
New super high-res displays are also nice on my eyes. The displays in between, those from the last decade or so, have been hit or miss for me.
that's me, and it really doesn't matter
why is it getting hot?
I noticed my old MacBook Pro was connected to my router even when it was sleeping... probably sending some private info periodically to Apple and the CIA.
If you'd like to change that, you can go to System Settings → Battery → Options → Wake for Network Access
Or just search for "Power Nap" (what it used to be called). They usually wake up intermittently for Time Machine backups, wake-on-LAN and other stuff.
I have mine set to `NEVER` [wake for network access] and yet it still makes DNS requests often while asleep.
Curiously, it is able to maintain network connection even through the 1/4" steel of the safe it's stored within. The older Intel MBP doesn't and cannot.
I have done this, yet every now and then my macbook still wants to connect to my bluetooth headphones from my backpack.