Unfortunately this model seems to be sold out, or perhaps even discontinued (until the European 4900U version becomes available, maybe). Meanwhile, the 14" E14 Gen 2 is very similar in both specs and price.
I'm kinda surprised how many people set their screens to 100% brightness (300-400 nits for most screens) on their desktop. I find that blinding and very uncomfortable. That also seems to be a reason why people complain so much about IPS bleed and glow: using the screen near full brightness in a dark room for gaming or movies. Personally, I find the "0%" setting on some screens, where that's around 50 nits, too bright for that (ahem, LG).
https://news.ycombinator.com/item?id=24316728 is a recent comment chain about this stuff.
Not everyone knows that this affects your eyes. Someone called me and asked if I knew anything about glasses that restrict blue light and would make his eyes feel better. After telling him about f.lux, we ended up finding out that his brightness was at 100%. He lowered it to 50% and said: thank you, much better.
Sadly, anything below 50-60% and I notice the PWM flicker, and it's really tiring. I know I'm not buying the most expensive stuff, but I'd like them to use better backlight controllers.
This effect isn’t nearly as strong on my phone’s OLED. With that, I can turn brightness down quite a long way and still have great color and contrast.
OLED/microLED desktop monitors can’t come soon enough.
Whatever dark forces AMD aligned itself with for the latest chips was worth it.
I bought this laptop after buying a Lenovo Ideapad/Yoga Slim 7 and returning it (because of some QA issues and 14" was a bit too big for me after using XPS 13 for a long time). I made a small review of the laptop here: https://www.reddit.com/r/Lenovo/comments/ilcw5n/lenovo_ideap...
I tried calling a Dutch retailer to ask if they ship to Austria but couldn't even get past the "Do you speak English please?" phase :(
Like WTF, the EU single market has been a thing for how many decades now?! So why the hell do we still have region-specific SKUs of the same product, with different parts and availability between EU member states?! Imagine if the laptop were available for sale in California but not in Utah, and in California you could only get it with a 512 GB Samsung SSD and a 300 nit display, while in New York only with a slower 1 TB SK Hynix SSD and a 400 nit display! /rantover
https://europa.eu/youreurope/citizens/consumers/shopping/pri...
Usually Dutch people speak English very well, maybe you can call another store.
QHD is not a popular resolution but it's absolutely perfect for this size, like you mention. It's better for battery life than 4k and it's much crisper than 1080p.
With a 13"-14" screen, you would have to scale a 4k display which is a complete waste in my mind. You either lose out on sharpness by getting fuzzy fractional scaling, or you go full 2x scaling and you lose all the screen real estate of a high pixel density display. 1440p is beautifully sharp without requiring scaling on a 14" panel.
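To put rough numbers on this (the panel sizes used here are assumptions, purely for illustration), the pixel-density trade-off can be sketched like so:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed panel sizes, for illustration only:
print(round(ppi(2560, 1440, 14.0)))   # 14" 1440p -> ~210 PPI, sharp at 1x
print(round(ppi(3840, 2160, 13.3)))   # 13.3" 4K  -> ~331 PPI, wants 2x scaling
# At 2x integer scaling, a 3840x2160 panel leaves 1920x1080 of effective
# real estate -- less than the 2560x1440 you keep at native 1x.
```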
I use a 13" 4k at native scale. I've heard this statement so often and it's always told as a fact when clearly it isn't so for everyone.
Do you run Linux on this? How's the performance? Would love it if you could test with a live USB of Fedora.
That said, not everyone needs color accuracy. It would probably work just fine for my usage.
They lost their lead in consumer desktop CPUs and said ‘oh, we have mobile and server’.
But now AMD has proved that Intel’s lead in mobile has been squandered.
And then the largest buyers of server grade stuff are cloud vendors who are waiting for ARM to come of age.
What I can say is that this level of investment is large.
Is that really so? Are remote development and emulation not sufficiently advanced yet?
Super hot paths might be x86 optimized, but how much does that really matter? I'd think at the scales of the big providers nothing matters more than performance/power use and performance/price.
Don’t underestimate what cloud vendors will do in pursuit of cost or efficiency - they are ruthless at both.
But much of the cloud is made up of services like S3, RDS, Redshift, Route 53, etc. ARM can be very competitive for many of these.
The new Apple machines are coming out next year, right?
I don't think Intel can recover, or at least not before 5 to 10 years from now.
My colleague got last year's X395 with the AMD 3000 series. It only lasts 3.5 hours on battery with web browsing and moderate coding, whereas it's supposed to last ~7 hours real-world on Windows.
The drivers are a huge mess of confusion (what goes into user space vs. the kernel? What do you really need? What even goes into the host OS vs. containers if you run Docker? What, if anything, can you actually get out of it without installing the proprietary, closed-source AMDGPU-PRO?), AGESA updates are needed to avoid kernel module crashes, oh, and these updates are left up to mobo vendors, some of which are great, some of which will make you feel like you bought a lemon. And then there’s the whole mess with Mesa that I think is only now resolved (20.1) and hasn’t yet made it to LTS distros.
I’m def not an Intel fan, but man, 100% working Intel drivers are an apt install away. I had both forgotten just what a PITA ATI was with Linux and couldn’t imagine AMD hadn’t stepped up its game at all.
Unless anyone has anecdotal evidence otherwise, set aside a couple of working days to hunt down and compile the right kernel modules, make sure the vendor provides recent enough firmware, and/or have patience.
In short, I wouldn’t hesitate having a new Ryzen for a headless server, or a desktop rig with a dGPU. For a smaller desktop, laptop, or anything else requiring use of the iGPU I would wait a year. Unless you’re one of the few people either already up to speed on all this or finding some absurd pleasure in learning about it, in which case I really do hope you post your process in a blog or forum where other users will find it through web searches.
Granted, mine is an IdeaPad 5 with an AMD 4500U, but it's been terrific. There are a handful of bugs still, but nothing that prevents productive work. The worst one is where turning down the backlight too fast kills the backlight entirely, but you can fix it by just bumping the brightness-up key and then going back down to your target.
* There is no driver for the Goodix USB fingerprint reader yet
* Occasionally thin lines of pixels don't update correctly (hard to describe, might get a photo soon)
* dmesg logs periodic errors with amdgpu and the new driver for the realtek wireless card, but I haven't noticed any negative functional impact associated with these.
* The TSC is marked unreliable on each boot.
That said, it's plenty usable as a daily driver, and the performance is very strong for the price point. GeekBench 5 results at https://browser.geekbench.com/v5/cpu/compare/3495992
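As for the backlight bug mentioned above, here's a hypothetical sketch of automating the workaround by stepping brightness gradually instead of jumping, assuming an amdgpu backlight device under sysfs (the device path and step size are assumptions, not tested on this laptop):

```python
import time
from pathlib import Path

def brightness_steps(current: int, target: int, step: int = 16) -> list[int]:
    """Return the intermediate brightness values from current to target."""
    if current == target:
        return [target]
    direction = 1 if target > current else -1
    vals = list(range(current + direction * step, target, direction * step))
    vals.append(target)
    return vals

def set_brightness(target: int,
                   dev: Path = Path("/sys/class/backlight/amdgpu_bl0")) -> None:
    # Assumed sysfs path; check /sys/class/backlight/ for the real device name.
    cur = int((dev / "brightness").read_text())
    for v in brightness_steps(cur, target):
        (dev / "brightness").write_text(str(v))
        time.sleep(0.02)  # brief pause so the change never "jumps"
```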
>Running through the multitasking load that I described earlier, in battery saver mode at 200 nits of brightness, the Slim 7 lasted 13 and a half hours. On the Better Battery profile, it lasted over 11 and a half hours. Remember: I was not going easy on this thing — you’ll certainly get even more juice if you’re just clicking around a tab or two.
But six hours with your IDE open, brightness at the default level for battery mode and compiling a Node.js + TypeScript project from time to time is something you can reasonably expect to be able to do.
Currently, even though I have my charging set to max out at 80%, I don't look at the charge indicator too often because I know that I have a good few hours before it's time to plug in.
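For reference, on Linux an 80% cap like this can often be configured with TLP, assuming the battery firmware exposes charge thresholds at all (this is mainly a ThinkPad feature; IdeaPad-style machines usually offer a cruder "conservation mode" instead). A minimal /etc/tlp.conf fragment:

```ini
# /etc/tlp.conf -- battery charge thresholds (hardware support varies by model)
START_CHARGE_THRESH_BAT0=75
STOP_CHARGE_THRESH_BAT0=80
```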
No it isn't. For that device, price ($300 with keyboard) is its major selling point.
So the thing is, if you never had one yourself and only observed others using them while away from their preferred work spaces, chances are you've literally seen glossy screens in a bad light :)
Of course, if your use cases are all text based, the potentially better picture quality of a glossy screen is indeed rather pointless. Either option is a compromise; what's better depends entirely on where and for what you use the device.
(Also: PRETTY! and APPLE! are important psychological factors as well :P. Joking aside, many people will look at a vibrant screen and how pretty it looks and make their decision without considering the specs, details, and use cases.)
For myself, in most situations glossy screens mostly mean I can see everybody behind me better than I can see my work; but they do have their place, especially for professional design/video/photo work in (and this cannot be overstated) a well-controlled environment (basically, make everything BUT the screen non-reflective / control the light :).
Also, "look great" is stretching it a bit - they tend to make things look slightly softer and desaturated.
https://nakedsecurity.sophos.com/2015/02/20/the-lenovo-super...
I really really wish Apple would make a reversible 2 in 1. I can’t tell you how much of a better experience that form factor is for young kids. iPads are not a replacement for this.
From my experience you get what you pay for with those professional machines.
I only paid $100 for it on eBay, so I can't really complain, but it doesn't seem much different from the average consumer laptop to me.
I just bought a CTO T14 w/4750u, 16GB+0, 400nit screen for ~$830 + tax, less than the price of the Slim 7.
Trying to find a Ryzen 4xxx with 16 GB of RAM in the UK is quite hard! Plenty of Intels, though; it's almost like Intel flooded the market, or no one wants Intel.
AMD had to prebook fab capacity at TSMC years in advance and didn't expect the shortages caused by the pandemic, made worse by WFH demand. Intel, by contrast, can make as many chips as it wants to fulfill market demand, since it owns its fabs.
It's a shame because the 4800U laptops are either sold out or going for huge markups right now.
I think it might play well in that role. A decent number of cores with a modest TDP seems like a good fit.
Also, I think the Ryzen 5 is probably better value, but it wasn't available to buy, so I went for a 7.
I believe ASRock is also going to bring out similar stuff, but I don't know the details.
Under Pop!_OS with minor tweaks the battery lasts about 5 hours, which is pretty good for a Linux laptop.
Think of turning off Wi-Fi power saving: my Wi-Fi speed went from 50 Mbps to more than 300.
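On NetworkManager-based distros like Pop!_OS, Wi-Fi power saving can be disabled with a small drop-in config (the file name is arbitrary; 2 disables power saving, 3 enables it):

```ini
# /etc/NetworkManager/conf.d/wifi-powersave-off.conf
[connection]
wifi.powersave = 2
```

Restart NetworkManager afterwards for the setting to take effect.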
These processors are great and OEMs could offer features users want, but they've still been offering only mid-range or gamer-oriented builds for everything else.
Tbh I would jump to any high res oled 13 inch laptop almost instantly. Screen is one of the most important parts for me.
Any review will be helpful.
My last laptop was a 15″ 1920×1080, and my current is a 13″ 3000×2000 (and I love the aspect ratio). I intend never to buy a laptop with a 1920×1080 screen again.
At some point I’m afraid I may end up with a ≥120fps screen and rarefy my tastes still further. (I hear good things about them, but have never seen an LCD with such a frame rate.) Fortunately screens with both a high frame rate and a high resolution are still vanishingly rare.
I just hope someone comes out with a good screen on one of these—I barely even care if it’s super expensive; because it’d be a real shame to have to decide between a good screen and a good CPU.
They're huge power sinks. If you need to play games on the laptop for some reason and can have it plugged in all the time, it works. But 4K often takes 1-2 hours off the battery life, and a high frame rate will likely have an impact too.
I don't mind 1080p, or 60hz, but I'd love proper HDR OLED for work, so I can have high contrast together with lower brightness.
Yeeks.
Same here and WOW, it's frustrating. There are so many laptops that would be amazing if it wasn't for their screens. Paradoxically 17" models are almost all crap.
Agreed, but there are plenty of dim IPS panels with poor contrast and color.
A couple weeks later, I got notified that due to supply issues for certain components, the ship date was pushed back without a new date, and I could cancel if I wanted or take a discount when it finally shipped.
This week HP said they cancelled my order altogether, along with a list of cancelled business laptop SKUs. All of them were because of a lack of AMD CPUs.
Supposedly I can get a discount on something else, but there's just not much to pick from.
However, 200% scaling is crisp and I can appreciate it. I just don't like losing all that real estate. And the fractional options on MacBooks aren't bad, but I can see text fuzzing out when it's not on the real Retina resolution. So when I'm running without an external display I do bite the bullet and deal with the lower resolution because otherwise I can feel my eyes straining.
[0]
Model | Physical  | Max       | Default   | Retina
16"   | 3072x1920 | 2048x1280 | 1792x1120 | 1536x960
15"   | 2880x1800 | 1920x1200 | 1680x1050 | 1440x900
13"   | 2560x1600 | 1680x1050 | 1440x900  | 1280x800

That left a really bad impression - I get that they are too lazy to actually measure the performance, but the snooty "synthetic" was uncalled for and frankly disrespectful to the people doing the work they are too lazy for.