👉 Best Laptops for Data Science: www.justjosh.tech/recommendations/Best-Data-Science-Laptops
🍎
- Apple MacBook Air 15: bhpho.to/3XUuise
- Apple MacBook Pro 14 M4 Pro: geni.us/xSC3S
- Apple MacBook Pro 16 M4 Max (14C): geni.us/t1YhrjD
- Apple MacBook Pro 16 M4 Max (16C): geni.us/CsQrXNB
🪟
- Lenovo Yoga 7i 15 Aura Edition: geni.us/sQUv
- Asus ProArt P16: geni.us/wqlnPaN
- Lenovo Yoga Pro 9i 16: geni.us/n5UqzGi
This type of video demonstrates exactly why the justjosh channel is on another level compared to other tech/laptop channels on YouTube
Alex Ziskind's channel is good too!
Except he forgot to mention aspect ratio. Verticality is so key for a data-cruncher laptop.
Persona driven buying advice is really smart. Super well done
@@smohan123 Watching someone, forming an opinion of them, and trusting some people more than others is not 'persona driven buying advice'. It's only problematic if you just buy what he tells you without thinking about it yourself, which nobody here claimed, as far as I am aware.
🤣🤣🤣🤣🤣 debatable
I am a professional Data Scientist and here are the things that I prefer in a laptop:
Minimum requirements: 15+ inches and 16GB RAM
Preferred: 32 GB RAM
Case dependent:
- If you use something like Excel (or another Office suite) or in-memory analytics libraries like pandas a lot, prefer laptops with high single-core performance. Also, choose the amount of RAM based on how much data you process at once.
- If you perform distributed analysis or build CPU-based models (like scikit-learn), prefer high multi-core performance and a high thread count (see the sketch just after this comment).
- For deep learning models on tabular data, I would prefer a faster GPU over more VRAM, though it depends on the scale of the data.
- For deep learning models on images or LLMs, I would prefer GPUs with high VRAM over raw GPU performance. My suggestion would be 8GB+. You can get a lot more done with a higher batch size.
- A good keyboard is nice to have but should not be the main buying decision imo, unless you are a heavy Excel user or something. This is because most programmers, including myself, spend most of their time reading code and data and far less time typing.
- If you use a laptop professionally, you will most likely be using a desk-style setup. In that case, long battery life and a nice trackpad are nice to have but should not be the focus. In a desk setup, it is better to plug in the laptop and use a mouse.
- For deep learning models, I prefer Nvidia GPUs since they are better supported and easier to set up.
- Screens are subjective. Choose the screen type based on your budget and how likely you are to use the laptop for other stuff (gaming, Netflix, YT). Choosing the screen specifically for coding is a bit overrated imo.
Bonus:
- If you use Excel a lot or any IDE (VS Code, PyCharm, etc.) a lot, buy a wide screen monitor. It is a game changer for development.
These are just my opinions and preferences.
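As a concrete illustration of the multi-core point above, here is a minimal scikit-learn sketch (the dataset is synthetic and the parameters are arbitrary) showing where extra cores and threads actually pay off, via n_jobs:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic tabular data standing in for a real dataset
X, y = make_classification(n_samples=50_000, n_features=40, random_state=0)

# n_jobs=-1 uses every available core for both tree building and the CV folds,
# so a high core/thread count directly shortens wall-clock time here.
model = RandomForestClassifier(n_estimators=300, n_jobs=-1, random_state=0)
scores = cross_val_score(model, X, y, cv=5, n_jobs=-1)
print(scores.mean())
```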
Yeah, but for machine learning we need VRAM and CUDA cores. I'm still a newbie, though.
@@Crazy_CJ_ By wide screen, do you mean ultrawide?
@kots9718 Most say you can see the UI elements better, but I disagree.
@@kots9718 Code is longer vertically but most IDEs have more UI horizontally (sidebars, explorers, minimaps). Wide screen means you don't have to hide or disable them.
A wide screen is mostly about having multiple things open at the same time, which prevents a lot of switching and scrolling. For instance, you can open the blog you are referring to alongside your code editor (or even split tabs on the same screen, as in Edge).
I use Jupyter notebooks a lot, and in JupyterLab or VS Code you can open multiple views of the same file side by side. No need to scroll too much to see what you wrote at the top. It is great for git diff for the same reason in any editor. Referring to any related file is easy for the same reason. I typically have one or two files and a terminal open side by side in VS Code at the same time.
In most IDEs you won't have to close most things. You can just dock them to the very left or the right.
@@kots9718 I wrote a long decent reply to how I use wide screens and somehow it is not showing up now. Stupid YT.
DS here who managed to get an M2 Max with 96GB at work. Love the device. But don't kid yourself: you will not do any significant LLM training on this thing either. It's fine if you stay below the ~1B weights range, I would say.
If you're thinking about buying something for personal use, just get a cheaper device and set up a local workstation you can SSH into. It will also help you build up your skills.
What do you mean? 1B weights would require 96GB? What about 8.72B params?
@@kingamv107 No, it does not require that much, but compute is limited, and for training on more samples you will probably want to move to the cloud anyway. I have not tried fine-tuning any 1B+ models on the Mac.
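For a rough sense of why ~1B parameters is about the ceiling for full training in 96GB, here is a back-of-the-envelope sketch; the ~18 bytes/parameter figure assumes FP16 weights plus an FP32 master copy, FP32 gradients, and Adam optimizer states, and it ignores activations entirely:

```python
# Rough training-memory estimate per parameter for mixed-precision Adam:
#   FP16 weights (2 B) + FP32 master copy (4 B) + FP32 grads (4 B) + Adam m/v (8 B) ~= 18 B
# Activations are ignored, so real usage is higher.
BYTES_PER_PARAM = 18

for params in (1e9, 8.72e9):
    gb = params * BYTES_PER_PARAM / 1e9
    print(f"{params / 1e9:.2f}B params -> ~{gb:.0f} GB for weights/grads/optimizer alone")

# Roughly 18 GB for 1B params and ~157 GB for 8.72B params, which is why
# 96 GB of unified memory still isn't enough headroom to fully train larger models.
```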
As a writer who trains our company's in-house AI models, I make it a point to watch each of these videos. Even when the content doesn't directly relate to my work, I find that there's always valuable insight for my laptop search.
I really appreciate that
An underrated feature for Excel heavy users is the full-size arrow keys.
For me, it's a non-negotiable.
I just started studying data science so this is the perfect video for me, thanks josh👍🏾
Never assume a laptop is good for AI/ML. You don't want a long-running task living on your laptop, making everything else awkward (annoyed by the heat but unable to pause/resume, accidentally closing the lid, etc.). I would rather build an ITX PC and SSH into it.
What job is going to pay to get you both an ITX and a laptop?
Also, the long-running tasks don't live on the laptop. The laptop is for developing the models; you only need to know that your code works on the laptop. After that, the work is done in the cloud. So your problem doesn't exist.
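For anyone curious what the "develop on the laptop, run elsewhere" workflow can look like in practice, here is a minimal sketch using paramiko; the hostname, username, and script name are made up, and it assumes key-based SSH access to the remote box:

```python
import os
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    "workstation.example.com",  # hypothetical ITX box or cloud VM
    username="me",
    key_filename=os.path.expanduser("~/.ssh/id_ed25519"),
)

# nohup + backgrounding so the job keeps running after we disconnect;
# logs go to train.log on the remote machine, not the laptop.
client.exec_command("cd ~/project && nohup python train.py > train.log 2>&1 &")
client.close()
```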
Using personas to calibrate the recommendations to prototypical use cases. Excellent quality video!
As a former Insurance underwriter turned IT analyst, most of the time you are in an office or working from home. 300 nits of brightness are perfectly fine. Lenovo ThinkPad T14 GX, Dell Latitude 7XXX, and HP EliteBook 840 GX will be the laptops assigned to most data analyst workers by their company.
Giving anyone who consistently views tabular data a screen any smaller than 15” is a crime.
@@CitAllHearItAll At my company, we just assign them a 16-inch HP Elitebook (860 GX) or a Lenovo ThinkPad T16 G3.
@@akin242002 Bro, how much RAM does your laptop have for running data analysis? What's fast and doesn't lag when running Kaggle? I'm buying a Lenovo Yoga 7 (Ryzen 7 6800U, 16GB/1TB) for junior data analyst work: internships, a portfolio, and gaining data analyst experience over 2 years.
This is one of your best videos yet! Excellent content organization, pacing, charts📊, the detailed recs and scenario analyses. Just wow.
Keep it up!👍
7.5 minutes in.. just wow, amazing data, love how you present the different processors, and the tier chart for fan noise/heat is incredibly helpful
Thanks for noticing :) I thought that fan noise chart would be a helpful graphic
I'm really not convinced about the case for LLMs on laptops, or even desktops. I don't work in AI, but conventional software engineering went through this over a decade ago with the transition from local and on-premise CI builds (e.g. Jenkins) to cloud CI (Travis, GHA and so on). I think it's beyond the scope of this channel, but if I was making this decision I would start with "what is it going to cost me in terms of cloud compute, and so what's the ROI period for a more expensive laptop capable of running the same work locally?". There are other benefits of running stuff on cloud, like greater flexibility in terms of instance sizing, and simply being able to close your laptop and let stuff run while you do other things.
For AI/ML, depending on how large the model you're working on is, you can still use your laptop offline just for testing (if your machine can handle it). Once you're sure your model works, you can offload it to the cloud and keep working on it there. This way the costs stay low, because it doesn't make sense to spend tons of money on something when you're not sure whether or when it will work.
@@themedleb Yes, but "it doesn't make sense to spend tons of money on something when you're not sure whether or when it will work" also applies to buying a super-expensive laptop. A year or two ago I bought a 32GB laptop with a 3060 GPU and quickly realised it was severely under-specced for the LLM experiments I wanted to do. Cloud is always going to be more flexible. That said, I defer to the experience of people that are doing this stuff professionally.
You are completely on point. While doing my PhD in ML I needed to change my machine far too often. With the recent LLM trend it doesn't make sense to run models locally when there are so many cheap cloud solutions available. The money I'm saving by going with a basic MacBook can buy me cloud compute for years. For some tasks running locally isn't an option, and for those that can be run locally, the cloud cost is almost negligible (because compute is a commodity and dedicated hardware has its advantages).
@Hundredthldiot I agree that the cloud is much more flexible and isn't limiting; in my opinion, it is the best overall for these kinds of things, especially for the price. But in your case, you should have analyzed your needs before assuming a certain GPU would fit them. That is not 100% your fault, though: AI has been progressing rapidly over the past few years, and with that progress the requirements keep going up too.
@@themedleb Sure, but LLMs wasn't my primary use case, and it only cost 920 euros (Yoga Slim 7 Pro X, great recommendation by Josh!) so it was a cheap experiment. In the end the limitation of that laptop was battery life and general ergonomics so the Yoga is now desk-bound and I've gone back to my 2020 M1 MBA for mobile use. Honestly, I still think Macbooks are unbeatable unless there are application compatibility issues.
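To put a rough number on the ROI question raised at the top of this thread, here is a toy break-even calculation; every figure in it is an assumption for illustration, not a quote for any real cloud provider or laptop:

```python
# Assumed figures purely for illustration
laptop_premium = 1500.0       # extra cost of a GPU-heavy laptop vs a basic one, in USD
cloud_gpu_rate = 0.80         # assumed on-demand GPU instance price, USD per hour
gpu_hours_per_month = 40      # how many hours of GPU work you actually run

monthly_cloud_cost = cloud_gpu_rate * gpu_hours_per_month
breakeven_months = laptop_premium / monthly_cloud_cost
print(f"Cloud cost: ${monthly_cloud_cost:.0f}/month, break-even after ~{breakeven_months:.0f} months")

# With these assumptions the pricier laptop only pays for itself after ~47 months,
# which is roughly the useful life of the hardware anyway.
```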
Today I was thinking it would be good to see this subject on your channel, and boom! Here it is! Thanks as always ❤
Since I just spent quite a bit of time doing research and running tests, I just want to note that the M4 Max 40CU (top of the line) sadly only has 34.08 TFLOPS of FP16. This is roughly equal to a desktop RTX 4060 in terms of compute. A mobile RTX 4090 (which is pretty cut down from the desktop version) will still have twice the peak FP16 Tensor TFLOPS w/ FP32 accumulate (also, 264 INT8 TOPS for quantized inference). Based on 40 RDNA3.5 CUs, Strix Halo should have just shy of 60 FP16 TFLOPS (but only 256GB/s of MBW vs the 576GB/s that both the mobile 4090 and M4 Max have). For reference, a desktop 4090 will have 165.2 Tensor FP16 TFLOPS (FP32 accumulate) and 1008GB/s MBW. Bottom line: if you're doing local training, don't use a laptop unless you *really* have to.
I thought the M4 Max 40 core was getting 18 TFLOPs of GPU power. Still good enough to compete with the laptop version of Nvidia RTX 4070. 34 would be massively off from the sources I looked into. Disinformation to an extent on whichever side is off.
But what if I don’t have to train the whole foundational model from the ground up? I just want to train the edge weight parameters using QLoRA and I’m willing to go down to INT4 or FP4 4-bit precision?
@@akin242002 18 TFLOPS of FP32 performance translates to about 36 TFLOPS of FP16 performance
@@Ro1andDesign Thanks!
@@tsizzle QLoRA uses 4-bit storage data types (saving memory), but computation still happens in FP16/BF16.
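To illustrate that split between 4-bit storage and 16-bit compute, here is a minimal sketch using Hugging Face transformers with a bitsandbytes NF4 config; the model ID is a placeholder and a CUDA GPU is assumed:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Weights are stored in 4-bit NF4 to save VRAM, but the matmuls are performed
# in bfloat16, which is where the GPU's FP16/BF16 throughput matters.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "some-org/some-7b-model",   # placeholder model ID
    quantization_config=bnb_config,
    device_map="auto",
)
print(model.get_memory_footprint() / 1e9, "GB")
```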
3:06 Having a high refresh rate display lessens the strain on your eyes when you see moving content on screen, which makes a big difference while I'm at work.
Excellent as always Josh. Please also review laptops for heavy AI based video editing workflows like using AI based functions in Davinci Resolve Studio
Wait, what?! I don't believe it is possible to use a laptop for anything other than content creation. It should not be allowed! The 'sad' thing is that those who need a laptop for something other than 'EXPORTING MASSIVE 4K VIDEOS' usually already know what they need or want. However, this is a great educational video and very professional. Thank you!
Please make a video on the best laptops for UI/UX DESIGN and CYBER SECURITY. Thank you. I love your work. You break every explanation down in the best way.
Another good topic, thanks Josh! Surprised that the Dell XPS didn't make the list.
Just watched this 4 minutes after it was posted!
As a machine learning engineer, the best laptop for my work right now is the Apple MacBook Pro. The primary reason is its unified memory, which allows me to load large language models directly into the 96GB of memory on my M2 Max MacBook Pro and have the GPU use them, something no non-Mac laptop currently offers. This feature is invaluable for prototyping and testing models efficiently.
Secondly, for machine learning work, we often spend significant time manipulating data in Pandas data frames, and for reasons beyond just clock speed, the Apple M-series chips consistently outperform x86 chips in these tasks. With the release of the M4 chip, this performance gap has only widened.
Lastly, Windows simply isn’t ideal for data science. Many libraries don’t install smoothly on Windows due to various compatibility issues. While the Windows Subsystem for Linux (WSL) is helpful, achieving the full benefits often requires maintaining a separate Linux OS, which adds complexity.
So, as someone who is trying his best to avoid Macs, I guess you'd probably say it's better to wait for Strix Halo laptops with unified memory rather than buying a Windows laptop now.
@@higochumbo8932 Generally, aside from a MacBook Pro, I prefer either a cloud setup or a desktop workstation with at least 96 GB of memory and an NVIDIA GPU such as an RTX 4090. However, if you really want to use a laptop and your projects don't involve large language models, I'd still recommend a laptop with an NVIDIA GPU running Linux. Windows is extremely bad for machine learning. If you really want to use Windows, I strongly recommend Windows Subsystem for Linux 2 (Windows 11 has it by default).
@@higochumbo8932 Honestly, as someone with a Windows laptop with 64GB of DDR5 RAM and an Intel 13th gen CPU who's finding it very frustrating working with Windows and a CPU that gets very hot, I may move over to a MacBook now that I can buy a mid-tier Pro model with 48GB of RAM.
However I’m waiting for the new AMD processors to come out so I can make an impartial comparison and educated purchase.
Although I’m done with intel 😮
@@bobhob35 What are you finding frustrating with Windows? Or is Intel your only issue?
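Circling back to the unified-memory point at the top of this thread: here is a minimal PyTorch sketch that selects Apple's Metal (MPS) backend when available and falls back to CUDA or CPU otherwise; the tensor sizes are arbitrary:

```python
import torch

# On Apple Silicon the GPU shares the machine's unified memory, so anything
# that fits in RAM can in principle be placed on the "mps" device.
if torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

x = torch.randn(4096, 4096, device=device)
w = torch.randn(4096, 4096, device=device)
y = x @ w                      # runs on the selected accelerator
print(device, y.shape)
```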
Are you a mind reader? So glad I found your channel recently and this is exactly a topic I wanted to know more about. Awesome but trippy. Great channel by the way. Keep it up. You got my subscription.
Thanks for making a video on laptops for programmers, really appreciate your effort 👏👏. I commented on a previous video asking about laptops for people like me, and here it is.
I came across this channel just when I was looking into potential laptops for my university work and this is a gem! Wondering if you could also cover laptops catered to the drug discovery and bioinformatics sector, as they require mid to high specs. I'm a student who does casual tasks but also works with some 3D visualization software, does programming, and may do light gaming, but I need a lightweight chassis. Are there any suggestions? I was looking into the G14 2024 but don't know if it still has thermal issues after the updates. Thanks for the informative videos, love it and subscribed!
Tysm Josh!
I've been waiting soooo long for this video!
This type of review would be awesome if it included benchmarks from LM Studio running a local LLM and Automatic1111 running Stable Diffusion :)
Wow, super helpful examining real use cases in such detail!
Thx for the great coverage. My workflow involves running AI LLM Model locally as well as Data Analytics/Science.
I am considering either the latest Intel/AMD CPU + nVidia GPU or the MacBook Pro with M4 Pro to upgrade my current setup.
Excuse me, what did you choose?
I have to say this method of describing different individuals and their needs regarding their tech is a pretty straightforward and efficient method of helping put perspective in new buyers. You and your team nailed it.
Kudos Josh for bringing up this content!
Thanks!
Really good look at these devices for these workloads, but would’ve loved to have seen some commercial Windows PCs compared rather than their consumer counterparts.
Loved the Analysis ❤
Great Work Just Josh 👌👌
Nice video as always.
Maybe you can do a video next time about best 2in1 laptops.
Thanks for the Aftershock shout-out. I remember when Metabox was the leading Clevo seller. Nowadays they are incredibly expensive.
When it comes to handling large datasets, the Mac is game-changing. I would regularly analyze large datasets over 10K in size, and on my 9th gen i9 Windows machine the fans would spin up like crazy and it would take forever just to manipulate the files. Then my company got me an M2 Max, and the Mac was able to run the same operations without breaking a sweat.
I just bought a 10-year old laptop with I7 for just 50 bucks. I will use it for AI
I was so lost until I watched your video. Thank you!
Appreciate that. Glad we could help
This was the video I’ve been waiting a long time for!!! Thank you!
Thank you for this video! I was looking forward to it!
Can't wait for more options in the 2-in-1 laptop category...
I've tested the Surface Laptop 7 with Snapdragon on Power BI and the performance surprised me, even while being emulated from x64 to ARM64. It can basically sustain the speed of Intel's last-gen H-series processors...
Awesome video. I am a data analyst doing some data science tasks on the side, and I decided to go with a PC instead of a laptop. The computational power you get for the same budget is just worth it if you are working from home. Nevertheless, Lenovo models were on my list when I was doing research, plus the ASUS ROG Zephyrus.
The question is which OS is better for data science: macOS, Windows, or Linux?
This could affect the purchasing decision
If you are connecting to a supercomputer or server like in the 2nd example from Josh, then it doesn't really matter which OS you use to initiate the connection. In my experience using both a Windows laptop and a MacBook, I can tell you that the MacBook experience has been smoother and connectivity seems to be more stable overall. A simple example of why a MacBook might be preferable: imagine never having to restart your laptop, and as soon as you open the lid you continue your analysis work in a completely silent environment…
Good job Josh. You've literally saved my career 💯
Incredible video man. Subscribed!
Josh, just because most 13th gen Intel laptops are loud and hot doesn't mean all are like that.
I have the ASUS ZenBook Pro 14 UX6404VV with a mobile 4060.
And yes, I had to replace the thermal paste, and I use self-made fan-curve control software I wrote in C#.
Now I can control the fans exactly the way I want. It's a 13700H laptop, gets 108/1112 on CB2024, and the GPU can sustain 125W.
It's very quiet unless pushed, and now with my controls it's always cool to the touch!
By the way, it's for sale since I'm moving to macOS. Region: Europe.
It seems 16" macbook has no real competition.
Lenovo legion 9i
@johnconstantine5228 Legion is not an option: unprofessional look, bulk, huge powerbrick, keyboard with numpad offset to the left, bad touchpad, screen is subpar. I'm considering Thinkpad P1 gen 7, but it has its own flaws (powerbrick, screen). And I'm not even touching the subject of working on battery. Unfortunately, hardwarewise, I don't see any real competition to Mac.
@@johnconstantine5228off center keyboard, like most 15+ inch Windows laptops.
This video was PROFOUNDLY useful. I did wonder why the Samsung Galaxybook4 Ultra wasn't included since it's light, has a long battery, and also a RTX 4070. I had been leaning towards one but since i'm in more of an "Agnes" role. Was that just an oversight? Or is it not recommended for a reason?
This video is a gold mine. Thank you Josh!! Will subscribe immediately.
Great video! I would include ASUS ProArt PX13, which is a unique 13" convertible laptop with RTX 4060 GPU, and Zephyrus G14, which is a good all-round 14" laptop with RTX 4070 GPU.
DS reporting in. Yes to Python locally, no to LLM locally. Love the channel!
Are you excited for the Blackwell equivalent of the Ada5000 GPU? When is it coming out
MacBook Air? Are you serious? This one should not be in the list at all.
Lenovo Legion 5 laptops are pretty good for data science 😉 (especially if you're training/running LLMs locally).
What do you think about the Legion Slim 5 with Processor Ryzen 7 8845HS, 32GB RAM 5,600Mhz Ddr5 and Nvidia RTX 4060?
@@leonardomurakami5125 I'm very close to pulling the trigger on that exact model. At the moment I have a Legion 5 gen 5 (Ryzen 7 4800H) and a Legion 7 gen 7 (Ryzen 7 6800H). The Legion Slim 5 will most likely be my next purchase (within a month or so). I just really hope it plays nice with Linux.
It's also going to depend on whether it's your home laptop or your work laptop. Your full-time job's work might be all cloud, sure. But what about your own projects? Part-time freelancing projects, passion projects, start-up projects, etc. You can't use your day-job laptop for those, so in that case you want your home laptop to be able to double as a second work laptop.
Can you also make a review of high-performance mini PCs, like GMKtec or Geekom and so on?
Unless my job were in finance, I would never get a laptop with a numpad, as the shift to the left really screws up your typing, and even after a while it's sometimes hard to orient yourself without looking at the keyboard, imo.
Thanks for coming to our school today!
You got it Jack!
Nice video. I agree with most of the points.
Finally! Love this niche content!
Hello, I was wondering if you guys could make one of these on cybersecurity. Love the videos on different careers.
I recently purchased a Legion 7i laptop equipped with an i9 14th generation processor, an RTX 4060 graphics card, and upgraded its RAM to 64GB.
Based on my experience with model training, I would recommend considering an i7 14th generation processor paired with a more powerful graphics card.
During model training, the GPU usage typically reaches 99%, while CPU usage hovers around 25%, and RAM usage ranges from 20 to 30GB.
Good comment
How is its battery life
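If you want to verify the GPU-bound pattern described above (GPU near 99%, CPU around 25%) on your own machine, here is a small sketch that polls nvidia-smi for utilisation and VRAM while a training run is going; it assumes the NVIDIA driver tools are on your PATH and a single GPU:

```python
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

# Poll a few times during training; if GPU utilisation sits near 99% while the
# CPU is mostly idle, the GPU (not the CPU) is the component worth upgrading.
for _ in range(5):
    out = subprocess.check_output(QUERY, text=True).strip()
    util, used, total = [v.strip() for v in out.split(",")]
    print(f"GPU {util}% | VRAM {used}/{total} MiB")
    time.sleep(10)
```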
As a student in the industry I chose the value options (saving about 2K compared to what is shown here while still getting a fast 16GB GPU ;D) and got a ThinkPad Yoga L13 G2 for $150 and a PC with a 14-core Xeon, 128GB of DDR4, and an RX 6800 XT for $600 (both second-hand). I also use Arch, btw.
I'd absolutely love to see you talk about other laptop manufacturers besides these giant companies. I'm using a TUXEDO laptop right now and I feel like it can easily compete with most laptops you show in your videos. If you don't want to use Linux you can still use it with Windows (also preinstalled if you want it); personally I'm running Windows (dual boot) and it works perfectly fine, with no way of telling that the laptop wasn't specifically made for Windows. TUXEDO especially has really nice laptops in the range of 14"-16" and >1000 and
My current laptop has a 3K 120Hz display, an R7 8845HS, a 60Wh battery, 32GB of RAM (6400MT/s), a nice keyboard (imo), and a good touchpad. It's possible to shut off things like the camera in the UEFI settings, and it has a custom OS made to use the provided hardware efficiently (TuxedoOS, an Ubuntu fork). It's also possible to freely choose between a wide range of GPUs, CPUs, SSDs, and RAM, not on every laptop in the same way, but you always have many options for SSDs and RAM, so you don't have to sacrifice anything like you do when you see a good laptop from e.g. Lenovo that for some reason only has a slow SSD and 16GB of RAM... AND as mentioned before, Windows runs perfectly fine and I didn't feel any kind of performance drop on it.
Next 2 videos are with smaller manufacturers
Awesome video Josh!
What are your thoughts on the MSI Stealth 16 AI Studio? Core Ultra 9 185H, 32GB RAM (not soldered, can be upgraded to 96 GB), 16" 3840x2400 120HZ miniled display, 2 TB storage, RTX 4080, 1.99Kg weight - this version in Europe costs an amazing € 2699. There's also a version with an RTX 4090 for € 3199...
A numpad is the most important requirement whenever I buy a laptop; a 16'' display is more of a nice bonus. I do some basic ML/scientific calculations using Python libraries, so for me CPU performance, especially single-core, is super important. I bought a Lenovo ThinkBook 16 G6 IRL and it's perfect for my needs.
- i7-13700H up to 5 GHz
- 32 GB DDR5 5200
- 1 TB NVME SSD
I don't know why you pay so much attention to screen resolution; it's important for large displays (>21''), but for laptops it seems overvalued.
If I use the cloud for training, do I still need at least an NVIDIA 4050 GPU in my laptop?
Could have mentioned the G16 AMD with the 4070. While not as high PPI, it has a very fast refresh rate and a brighter screen than the P16. Also no screen-door effect.
Great video!! Can you make one for engineers?
Would the Acer Predator Helios Neo 16 and comparable laptops like the Legion, ROG, etc. do well with AI/ML/DS/DA tasks? The specs seem on par with the 7i and 9i mentioned. I'm wondering whether it's more the portability of the Yoga laptops that got them onto this list, or what the biggest defining feature was that made them a favorite.
I think my use case will be more like Agnes. I'd like to run LLMs and perhaps other multimodal models (and perhaps Stable Diffusion for imaging) locally. To avoid having to pay per token, and because of privacy concerns with cloud services, I'd really like to run the models (both training and inference) locally. I heard that while inference is not that memory intensive, it needs high memory bandwidth (256-bit memory bus, 576GB/s bandwidth on the 4090). I need something powerful enough to fit models like Llama 3.1 70B onto the GPU. I'm looking to fine-tune with QLoRA and potentially incorporate RAG with a vector database. Would a mobile 4090 with 16GB of VRAM be enough, or will I need to look only at GGUF-quantized models with reduced FP4/INT4 4-bit precision, plus paging optimizations, to fit everything onto the GPU? If that's the case, could I go down to a 4080 with 12GB of VRAM or a 4070 with 8GB? I also need to be able to run multiple VMs on the laptop and maybe set up a Kubernetes cluster for testing purposes, maybe 2-3 nodes. I need to do it all locally on the laptop and bring it to meetings at a company that is air-gapped, where I can connect to a projector (via DisplayPort or HDMI) to demo everything and show my Jupyter notebooks.
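A quick back-of-the-envelope on the VRAM question in the comment above (weights only; the KV cache, activations, and LoRA adapters are ignored, so real usage is higher):

```python
params = 70e9                      # Llama 3.1 70B parameter count
bytes_per_weight = {"FP16": 2.0, "INT8": 1.0, "INT4/NF4": 0.5}

for fmt, nbytes in bytes_per_weight.items():
    print(f"{fmt}: ~{params * nbytes / 1e9:.0f} GB of weights")

# FP16: ~140 GB, INT8: ~70 GB, INT4/NF4: ~35 GB.
# Even at 4-bit, a 70B model's weights alone don't fit in a 16 GB mobile 4090,
# let alone 12 GB or 8 GB cards, so at that size you're looking at smaller
# models, heavy offloading, or running the model off-device.
```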
This is why I love your videos. Great reaction!
Josh, at 2:20 the laptop shown is a Zephyrus G14, not a Yoga Pro 9i.
Hi, nice review. Can you do a comparison for virtualisation? Probably a bigger market than data science. Thanks.
Another great one, nicely done Josh! Can I pick your brain for just a bit... I'm looking to switch to an M4 Pro once your reviews are out, provided everything is fine. I currently virtualise Windows on ARM on my Mac and run Power BI Desktop in it whenever I need to work with it... 16GB is jusssst about right but teetering on the edge for sure... What is the best-value M4 Pro model you'd recommend? I'm fine spending a little more, but I want to get the most out of it...
Dell Precision, HP ZBook, and Lenovo ThinkPad P series are the best. You will appreciate a Dell Precision when working with deep learning models.
In fact, any mobile workstation is better...
A Dell Precision or HP ZBook with an Nvidia RTX 4090 GPU is currently impossible to beat for machine learning on the go. The GPU power is so immense that NASA uses both of those workstations.
The closest thing is the M4 Max, but that doesn't have the GPU power to compete with those two. It is lighter to carry around and has better battery life, if that counts.
At 12:29 the wrong laptop is highlighted in the graph.
Awesome video, could you please do one for software engineers as well?
Here you go: th-cam.com/video/STAx2JgHhrM/w-d-xo.html
Exactly what I needed
I noticed you didn't mention the Legion 7i here, is there a reason it wouldn't fit this category?
Don't spend extra on screen quality, battery, or build quality, as most of the time the laptop is used as a desktop. Spend it on the CPU/RAM/GPU.
Hi Josh. I love your vids. You have a lot of experience with computers. I've got a question: do you still use those thick computer books to learn new computer knowledge, or is it all online today? How do you learn CS today? Thanks for your work.
Please tell me, what is the best laptop at the lowest price for running SPSS?
Josh, you can get 96+ GB of RAM in the P series, and some have 3K displays at 14.5-16 inches. But their dGPU choices are limited. What's your opinion?
Any ThinkPad that can be added to this video??
Thanks so much for this video. One question: is there any reason you like the Lenovo Legions instead of the Lenovo Thinkpad p16? Thanks!
IDK feels pretty stupid buying a laptop with a 40 series card when we know that the 50 series is coming out within the next few months
One callout about using MacBooks for ML and AI is that they don't support current model quantisation methods. In some sense, you're trading away half of the MacBook's high-RAM advantage by having to load models at double the precision (or more) compared to loading them on CUDA with the bitsandbytes backend.
Currently 8-bit (or 4-bit for very large models) is the precision sweet spot. Yet on the Mac the lowest precision supported is just 16-bit (half precision).
Great video though. I personally use the MacBook Pro M3 and the G16 HX370. Very happy with both, though I generally do cloud training.
Could you ping me at our email address about this? I'd like to discuss it a little more.
Finally someone made it 🙌 ❤
Unfortunately nothing comes close to the battery on a Mac. I just made the expensive mistake of testing a new Intel Lunar Lake machine with Linux. I had to charge it twice a day for 2 days before I decided to return it. 😅 A $1000 mistake, losing 20% on the restocking fee 😅
FYI, I tested the Yoga Pro 9i
Look into the ThinkPad X1 Carbon. Getting them used is cheap and the battery life is great.
I am currently seeing another round of creators-sponsored "reviews" for X-Elite laptops and they are still quoting 24-28 hours of battery life, trying to fool people. I have tested them all and none come close. I would say 8 hours of SOT doing multitasking work, not watching movies. I get 50-100% more SOT on my Macbook. I suspect your example with the new Intel is more informative than the recent wave of sponsored ads by YT creators quoting ridiculous unreal numbers. As you say, Macs are the best for efficiency especially when multitasking.
Hey Josh, I am an AI developer student from Europe. Do you have any recommendations for a 14-inch system?
If so: I like to work with MLflow running in Docker images, and I need something powerful enough for PyTorch, TensorFlow, and Keras deep learning work.
Awesome as usual
Could u make a video about laptops for cybersecurity? Thanks
What does it mean to say programs like Tableau (for example) do not “run natively on the hardware” but they do “run under emulation” ?
Which do you recommend then for MATLAB??
Hi Josh, thanks for your great insights. What’s your opinion on the Lenovo X1 Carbon Gen 12 and Dynabook Portege X40L-M?
May you do cybersecurity please?🥺
It seems that the Yoga Pro 9i is the most versatile, cost-effective choice of them all if you consider Windows. You just need to import the version with the 4070 😅
A matte screen allows you to see your content without cranking the brightness too high, which might hurt your eyes, right?
I am going to college next year and I plan on getting my master's. I want to study cybersecurity for my undergrad, and then machine learning for my master's. When choosing a laptop, there are a lot of options like you have mentioned, but I was mainly looking at the 16-inch MacBook Pro with the M4 Pro chip vs the Lenovo Yoga Pro 9i. I would get the MacBook, but the problem is the operating system. I know businesses mainly use Windows and not Apple that much, though I heard they are moving a little bit toward Apple. Do I get the Lenovo Yoga Pro 9i, or the 16-inch MacBook Pro with the M4 Pro chip?