This is something I just can't comprehend. For work, I am using a 16" MB Pro and I had to get a separate screen for it as I cannot connect it to my iMac for display. One would think it isn't too uncommon for iMac owners also to have a laptop.
And if I could use my iMac as a screen, I might have grabbed an M1 Mac already. Instead I am waiting on a refreshed iMac.
I use Screen Sharing to have my MacBook screen show on my iMac. Works flawlessly over Gigabit, except that cmd-tab sometimes lags. There are tools available[0] that allow you to set a higher resolution for a screen than what would be sensibly supported by the LCD panel, though it doesn't do Retina this way.
That's interesting to hear, because my experience is that I really really _want_ to use Remote Desktop / Screen Sharing but the performance has gotten worse and worse over macOS releases, and it is currently abysmal over gigabit, and even 10-gigabit, wired Ethernet. (I could not see any difference between 1 Gbps and 10 Gbps, which made me think bandwidth is not the problem.)
I'd love for this to be some flaw in my own setup, though.
I used to use this setup circa Mac OS X 10.6 and it worked very well (1 Gbps wired connection). I could leave my home office and screen share in via my MacBook Pro from the living room when I had to watch my kids or whatever.
Today, I don't even try that. Command-Tab lags, yes, but almost everything lags to the point of being super-annoying to use. Even typing lags. I also have a Windows box in my office, and this setup basically works (even from my MBP). So I've been assuming this is one of those features that Apple has let degrade to the point of unusability.
But am I wrong? I'd like to be. Are others using Screen Sharing or Remote Desktop on macOS with success beyond like click... wait... click... whew I got a system update installed? I mean, for like typing emails or coding?
(One thing I'd been thinking of debugging was whether my office Mac having 3 5K screens is an issue — I am only trying to screenshare one of them, but maybe they have a bug or something where total pixels of the host kill perf even if those pixels aren't being shared.)
One issue I had is that it was constantly using the WiFi connection instead of the wired one. So I created a shortcut where I replaced the hostname with the wired IP address; this solved a lot of problems for me.
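For anyone wanting to script that trick, here's a minimal sketch. The address 192.168.1.50 is just an example; substitute your own Mac's wired IP:

```python
import subprocess
import sys

def vnc_url(ip: str) -> str:
    # macOS Screen Sharing accepts vnc:// URLs; pointing it at the
    # wired IP avoids Bonjour resolving the hostname to Wi-Fi.
    return f"vnc://{ip}"

# Example wired address -- replace with your Mac's actual wired IP.
WIRED_IP = "192.168.1.50"

if sys.platform == "darwin":
    # `open` hands the URL off to Screen Sharing.app (macOS only).
    subprocess.run(["open", vnc_url(WIRED_IP)], check=True)
```

Saving that (or just `open "vnc://<wired-ip>"`) as a clickable shortcut does the same thing as renaming the connection by hand.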
For my use case of development, it works fine. No noticeable lag, even as I type this comment; video calls and web browsing/scrolling/YouTube are also fluid at 2560x1440, using around 7 MB/s.
With three 5K screens I could imagine video memory or some GPU bandwidth being an issue, limiting the screen sharing even though only one screen is shared. But the problem must be either one of lag or one of bandwidth. Maybe try measuring network throughput and ping?
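One quick way to get a latency number without extra tooling is to time TCP handshakes against the Screen Sharing port. A minimal sketch, assuming the shared Mac listens on the default VNC port 5900 (the address is just an example):

```python
import socket
import time

def tcp_rtt(host: str, port: int, attempts: int = 5) -> float:
    """Rough round-trip estimate in ms: time a TCP handshake to host:port.

    Takes the minimum over several attempts to filter out scheduling noise.
    """
    samples = []
    for _ in range(attempts):
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=2):
            pass
        samples.append((time.monotonic() - start) * 1000)
    return min(samples)

# Screen Sharing listens on 5900; the host below is an example address.
# print(f"{tcp_rtt('192.168.1.50', 5900):.1f} ms")
```

For throughput, running `iperf3 -s` on one Mac and `iperf3 -c <ip>` on the other gives a cleaner number than this handshake trick.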
Maybe try a 3rd-party remote desktop system with a focus on performance instead of the standard VNC server included in macOS? Parsec might work for your use case. The disadvantage is that these kinds of remote desktop systems achieve their performance by using lossy codecs instead of the lossless compression used by VNC, so you might notice some banding/artefacts depending on your network conditions / compression level. The advantage is that the hardware encoding/decoding is very fast, even fast enough for gaming.
I think it was just that at the time the 5K iMacs were introduced there was no (good?) way to support that resolution over Thunderbolt or whatever, so support was dropped. Then it was just never added back...
The 5K iMac was really ahead of its time in late 2014. I remember reading some technical blog talking about the custom-designed driver board for that panel.
The 2015 MacBook Pro could drive a 5K panel using two DisplayPort cables (some 5K monitors from that period supported that solution for full resolution at 60Hz), and with modern DisplayPort there should be no problems, but maybe Apple does not really want their computers used as 'dumb' displays anymore...
Apple already has Sidecar that turns your iPad into a monitor. It's not a stretch to enable that feature on Mac so you can turn your Mac into a monitor too whenever you need it for some reason.
That is another sad story. As far as I know, Sidecar works in both directions: you can use it to make your iMac a monitor for another Mac. However, there is a huge catch: those two devices need to be signed in to the same Apple ID. Which of course makes it impossible when trying to connect your work computer with your private one. Of all companies, Apple should have an understanding of keeping your work files in a separate environment :p.
It seems like this was a technical limitation more than a business one. Apple had to do some very non-standard stuff for 5K, and that's when they stopped supporting Target Display Mode. See the article as well: the only way the author could get it to work was with two DisplayPort connections, so if Apple were to support target display, they'd likely have to put two DisplayPort inputs in the back just for that.
It would be nice if all all-in-ones / laptops had target display mode. The versatility would be great.
The number of times I've wanted to use, say, a console when the TV's taken, or use my mini-ITX desktop away from home. Being able to plug into a laptop that I'm carrying anyway would be great.
At the same time, the latest iPad Pros connected to a Macbook via USB-C provide a remarkably convenient and performant second screen experience when doing dev or other creative (multi-screen) work on the road.
The magnetic mount of the iPad Pro magic keyboard means you can just grab the iPad and pop it back, making it super convenient to use either way.
Absolutely, I have 3 old iMacs with perfectly good displays that I could only use by running the computers attached to them (via VNC or similar). I would definitely pay 150 bucks like in TFA for a driver board that would allow me to rip out the Mac parts and only keep the display...
While it was a useful feature, it was always an expensive and wasteful way of buying a secondary display. Arguably it was the only way of making a dual-display iMac setup look good.
I wouldn't deem it wasteful. Monitors have evolved a lot slower than computer parts generally, and using a monitor for a relatively long period is quite common among desktop PC users. A midrange 1920x1200 display bought 10 or 12 years ago is quite similar in specs to today's low-end displays.
The 4K and 5K iMacs from a few years ago have high-end integrated displays even from today's point of view, but their CPUs and especially GPUs are outdated for quite a few serious use cases. I think it would make a lot of sense to use one as a display for a new workstation setup.
But obviously my described use case is a "second chance" for an iMac having served as a standalone computer until its EOL.