“Wouldn’t it be cool if” is why a lot of us became programmers. You sound like the fun police. I do completely agree that user convenience trumps developer convenience.
There's a key point missing though: he's not saying “Wouldn’t it be cool if” is a reason not to do something. It's the reason to ask/check for other reasons (assuming costs are involved). “Wouldn’t it be cool if” is an invaluable tool in the brainstorming phase, or for picking hobbies (how many of us became programmers). It should however not be considered enough for the refinement phase, or for picking a career. That's where 'it would be a good selling-point, and now that I've actually thought about it for a moment, here are 5 more benefits (with the following fine-tune potential)' comes in, or, for the career, 'given the trivial scalability, testability and reversibility (in 90+% of cases) it's such a universal solution that it provides variation, job-security and decent pay'.
My interpretation of what he's saying is that “wouldn't it be cool if” is antithetical to good design choices. And I personally think it is a key first step in all designs. Then he bashes on 3-D explosions that I'm sure tons of kids love to play with; even though it makes him smile while he's doing the talk, he says no one needed it and it's a complete waste of development dollars. That offends me. It's the little things in life that make it worth living.
@@Diamonddrake that "my interpretation" might be key here. I agree the introduction-story of the 'wouldn't it be cool as red flag'-point could have been clearer on this, but in the closing summary he unambiguously states it's 'doing the thing *only* because it's cool' that's the problem. As for the '3d explosion'-thing: is it an issue it was created at all? No. Might it be an indication of an unhealthy priority that the (admittedly entertaining) little toy was the *key* presentation, at a major event, from a 'both top 3 OS-vendor and top 3 corporate cloud service-provider'-sized party? Probably.
The WORST UI decision ever made was making buttons "dynamic" and making tabs resize instead of statically sized but multi-row. On Windows XP you could mentally queue up commands to perform because you already knew after you clicked something where the next button you would click was going to be. You could use the OS faster than it could process your commands because it was static like that. Then they made everything animated and dynamically positioned, so now you have to look and see where the next button you're going to click is. Incalculable wasted time over every user of Windows.
There's another side of the "cool" coin - great UX ideas getting shot down because they may seem fancy or extraneous when sometimes they're the details that breathe life into an application (versus selling software that just "tries to get a passing grade"). Seems like this is kind of a long-winded "start with why" point being made.
5:47 "Put yourself in the mind of the user" That won't work. The people making the decision truly believe their solution is what the final user needs/wants. Any attempt to think like the users will just validate their decisions.
There are real design concerns (physical switches vs touchscreens, the AS/400 sales example, and the MS toy explosion videos), but he keeps stepping outside his area of expertise and saying seemingly stupid or inappropriate things that taint the legitimate message. Like when he dumps on the government he isn't professional, appropriate, or logical; he does it with "I imagine they had a meeting and said..." a lot instead of real evidence, and ignores that governments must operate differently than businesses because they do things that businesses can't or won't. There are real complaints to be made about the F-35, and this person has only a superficial grasp on them for someone who spent so long talking about it. Then, in this talk about thinking about design, he leaves his notes and email up instead of designing his workflow and having it ready with a single alt-tab, or at least minimizing the other windows. Maybe this guy was serious once upon a time, but he seems like a clown now.
Thank you for saving me from writing an entire comment, as you covered my objections with the talk eloquently. Showing disdain for all government projects when, like in private industry, some succeed and some don't. Talking about Microsoft as if it's some clown college that didn't invent, I don't know, the Azure cloud.
We don't do computers with touch keyboards. Mechanical tactile feedback is important for efficiency when your eye focus lies somewhere else. On mobile phones touch works because you can look at the keyboard while typing with two thumbs. On a computer my eyes are on the screen, the keyboard needs to work blind, and the feedback from the keyboard is more valuable than the screen prompt. Especially in a fighter jet you need to know you hit the button without looking.
I don't follow the question at 1:01:27. I gather it's about why his website is the way it is. It looks fine to me. The problem I have with it is that it wasn't served up over HTTPS. Blog is shorter than I would have expected. There's only one article and it's a 6-year-old white paper.
This is only 6 months old but I can't find the "edit more" in Photos anymore. I immediately tried to add explosions to any videos I had at hand. Edit: Oh yeah, I have W10 so it might be a W11 feature, bummers.
I find the first mistake of the design team was not interviewing fighter jet pilots repeatedly throughout the development: first to gather their opinions about what they want from the jet and the interface, and then to pass the later ideas through them, getting feedback at pretty much every point of the development. You can't sell a product as a black box that you know nothing about, know nothing about the end user, or don't understand any of it yourself.
Regarding tactile buttons vs touch screens: when I had my Nokia cellphone back in the day, as a teenager I could ride a bike while texting with the phone in my pocket. Today that's a guaranteed incomprehensible message and potentially a traffic accident. Most of the time, even when texting in an unstressed situation, I find myself cursing because apparently my thumb is not touch screen compatible, either not activating it or hitting wrong keys repeatedly. Pressing small interface buttons on a smartphone is impossible to do right on the first attempt. And I bet everyone has noticed that difference. What about touch buttons powering on and off from just an accidental swipe, and sometimes even from random objects that aren't your hand. Now does that experience sound like the choice for a situation where reaction time and life/death are the top priorities?
8 times the code also for some reason rings an alarm bell for me. I understand more complicated stuff requires more complicated code (at least sometimes), but in a fighter-jet-like application my philosophy would be that you want to minimize the opportunities to fail, which to me means minimizing the amount of code (that inevitably needs to be analysed and debugged for possible issues that _can't_ happen).
Oh how I've hated all the apps that moved to browser interfaces. It is so often such a compromise that you start to shiver just because you hear it's browser based, before even seeing it. I understand it in a thing like SAP, where you'd do remote access for licenses or whatever, so it's just practical to do it that way, but the browser is not a comfortable environment to use an app. First of all, it usually robs all the key binds and doesn't follow your normal quick-key patterns.
I love the red flag phrase "the users just need more training on that". I've been telling anyone who agrees to listen, in most of the circumstances where it's relevant, that at least from my engineering perspective, if the user has any ambiguity, any confusion, if it's not intuitive from a glance how they get their peak priority done in the thing, it's not good design. It's particularly obvious to me because when I open some interface, like a webpage or instructions or something, I'm not opening it because I want to learn it through and through by heart. I'm usually at least slightly annoyed, frustrated, upset. I have a problem and I want to get a thing done. That's when I'm really unable to read a long sentence or a paragraph of sentences, browse through menus and whatnot. It has to be obvious or I'm gonna miss it.
And if I in fact am in a more relaxed state, have time to look it up etc, I'm just left thinking "well that's shitty, I've spent more time trying to figure out where to find that one little thing, which is probably one of the most likely user needs, or at least the more critical one if not the most common one, when it should be glaringly obvious where to find this information". Usually webpages and interfaces are full of unnecessary or once-in-a-blue-moon stuff that for some reason was deemed to be one of the more core features or options.
Basically it's like being in a school that has maths and physics (or coding). Most of the people are not too dumb to understand how to solve the problem. Most of the people are just really frustrated by material that they can't figure out how to apply to their specific problem. Because it's confusing and the rules aren't obvious. I feel like coding is one of those things that particularly shines at this. There's all the material, either documentation or the course material showing how to do stuff. Then you engage in a task you need to build, and you've figured out the design patterns etc, but you spend forever trying to figure out how in the hell the syntax and interface for that thing works, and none of the material or documentation has that.
Rust is known for having pretty good examples and material, and even rustlings, a training system or whatever. But then there's every once in a while an exercise where you think you're doing just a normal application, putting things together and using things nested or something. Only to finally find out that for one of the features you need a syntax of {} or something, where none of the existing material you find online shows using those instead of (), for example, with that feature, and unless you use those, it doesn't work. Everything else in your whole code is fine but you're ready to give up because nothing makes it work. Except that thing where everything else uses (), heck, even that feature uses () normally, but now there's a hidden rule that you need to use a different thing. It's not even a good learning event, because you just discovered a magical artefact that is woven to be kept secret from everybody, a hidden extra that you have to find for this very specific unique case.
For some reason doing this kind of design is close to my heart, where you approach a thing saying "no, that does not work, people are too stupid to use that; no, you can't do that because it allows people to make mistakes; no, that doesn't pass because it requires learning and knowing about this and that; no, that doesn't fly because it's not explained in unambiguous, short, accurate terms" and try to iterate to something where you just don't need a manual to use it, even if you know nothing about it. I think some manufacturing or product design philosophies follow that somewhat, or tell you to make intuitive user interfaces that either take advantage of a concept that everyone is familiar with, or, if a design is similar to an existing familiar design but doesn't work that way, change it enough to not make the user think of that application. That's what I dig. Trying to make everyone's life easier.
In fact Microsoft was sort of right about "folders are where apps go to die". Windows needs folders to hide all the useless crap it ships with, so you can have a start menu or toolbar with fewer than 10 of your favourite apps.
48:10 A 90 minute meeting on a new feature to be built, and in less than 10 minutes they start talking about the actual functionality to the point that dreaming off into code becomes an option? Where can I join?
I always get mad at the Windows 8 example. Because it was really nice. The issue was not the UX but the inconsistency: half of the things were Windows 7 and the other half were new, with things moved to random places. I didn't find it that cool, but the tiles were quite useful and added productivity. And what's up with the "no folders"? I always had folders, or was that in a very early version for insiders?
My main beef with win 8 was that in 8.0 control panel/settings didn't show up in regular search, you had to go to a separate tab. 8.1 made it one universal search, and I really enjoyed it. Now, in win11, I miss the tiles - in fact, I miss getting to decide for myself how many apps I wanted to have visible on my start screen/menu. I preferred 8/10s icons/tiles.
I think that part of the problem was that the screen completely changed when you entered the start menu. It confused users because the context of what they were doing was completely lost, and it was not immediately clear how to get back, for example.
@@xybersurfer Not to mention the focus on "everything maximized to the whole screen" (start menu included). It makes sense for 99% of the use cases, and much of the rest is handled by putting two "full screen" windows side by side... but it fuelled the whole "Windows no longer has (overlapping) windows" fun :D It doesn't help that it clearly was a design that was honed by small handheld devices with touchscreens... and by that point, those were _heavily_ associated with horrible anti-consumer practices.
@@LuaanTi The issue was: the 'let's make 'windows for phones or tablets' the only way to do things now, even on _actual computers_'-design decision. Rightfully backtracked into 'let's make it the default', within a week. Maybe the massive rush on 'and how do we switch away from the default' was an overreaction-driven example of 'users with status-quo-bias'. I'd go with probably, but it was reasonably predictable. Now if only the release-process had even a fraction of the design-attention they (claimed to have) put into the ''windows for phones' is a great OS for desktops, trust us'-claim itself.
At around "46:20" you say "It is true that users can not......" can I add "that it is also also true that many companies don't have people with different perception skills like dislexcia on there UI design teams or as a dislexic my self they seem not to because so many new UI's demonstrate that".
If a touchscreen is added to a device, it will eventually drive out other input mechanisms. It's not entirely coolness: anything with moving parts is expensive.
In the given example of an F-35 I call BS on the expense part. It was probably about 1000 times more expensive to write the code, debug the code, get the code integrated, have a user interface on a touch screen, etc. than to add a manual switch. The development and integration cost is only recovered if you sell a few hundred thousand of the items you're putting it into.
@@guidon.5413 Why is no one able to think about it for a minute? How are you going to implement a GPS map with manual switches? Of course you need a touch screen!
@@johnflux1 - no, you don't. You might not have thought this through. A) You don't need a touch screen for a GPS map. If you think you need this, think again. GPS devices without touch have been around forever. B) You might want to do a touch screen for one specific purpose and nothing else. The person not thinking is you.
@@guidon.5413 "You don't need" - lol, so you're arguing for having a touch screen for some particular purpose, so the hardware is all already there, but then actively prevent them from using it to do basic pan and zoom? Is that really what you're pushing for?
He compares F35 cost to Poland's ANNUAL GDP. That's what's called a 'flow variable': it has time in the denominator. F35 cost does not. It's like saying 'the Burj Khalifa is 828m high - that's more than twice the speed of sound!'
If you were doing math you'd be correct, but he's just using it as a language tool to aid understanding. He is essentially saying "this project cost more than the GDP of a reasonably sized country" to give a sense of scale. GDP is by definition annual. TL;DR? Don't code it like that, but it is reasonable language use.
@@alan_davis you are entitled to your opinion but I disagree. It's not maths, it's something you are taught in first-year physics, and if you think *really* hard you will understand that the most famous equation of the 20th century, E=mc², is easier than it looks, because the units on both sides must match. Also in ft comments this topic was raised and highly recommended in the comments section.
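To spell the units check out explicitly (symbolic only, no specific figures assumed):

```latex
[\text{GDP}] = \frac{\$}{\text{year}}, \qquad [\text{program cost}] = \$,
\qquad \frac{\text{program cost}}{\text{GDP}} = \frac{\$}{\$/\text{year}} = \text{years}.
```

So dividing a program's lifetime cost by a country's GDP yields a time span ("so many years of that country's entire output"), not a dimensionless multiple; the comparison is only meaningful if that time unit is stated.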
I think I'd have left this one before the half-way point... While it's got an important underlying message (UX is important / ask "why?"), it was long-winded and not very engaging. You could do this talk in 20 minutes with 2-3 key examples (none of the ones from this), rather than deciding to claim the F35 project cost was down to the choice of a touchscreen... which I suspect lost most rational viewers immediately.
The topic is interesting but I would expect a bit better slides from a UI design expert. The slides, full of text, all in the same font, and appearing all at once, are very distracting.
Windows 7 appeared so ugly to me that I was happy with Windows 8 and then 8.1. It was far from perfect, with some design mistakes not yet fixed, but for me it signified the start of the move away from unnecessarily fiddly and decorated interfaces. The transition from WinForms to XAML-based UIs had to happen, but the new UI language wasn't figured out yet. I think it is comparable to the evolution of the Android UI that was happening during approximately the same time period: a lot of trial and error during the first few major versions.
This is possibly the least well argued case I have ever seen on NDC, and I've seen some shockers. Now, I'm not a huge fan of the F35. But I would never premise my whole argument on the completely made up, uninformed guess that the F35 designers based their design on the phrase "wouldn't it be cool if", and then draw a whole bunch of conclusions from that. I would never be so daft as to think that I could draw an analogy between "not using the best jet available all the time" and "not using the best software available all the time", because of course you don't use the jet with high operating cost for operations that don't need its capabilities. That sort of operating cost per flight hours problem simply does not apply to software. The analogy is boneheaded. Why would I keep listening to a presenter who introduces their argument with such stupidity?
I'm willing to assume he's knowledgeable about his area of expertise, and not knowledgeable about defence procurement or finding suitable analogies for conference talks, and give him some benefit of the doubt
It's difficult to have it both ways. He crafted a speech/presentation choosing a specific analogy to best convey his point. If he's not an expert in that domain or doesn't understand the analogy he's creating, it says something about his expertise as a presenter and should call into question his assertions. I found Billy Hollis a bit lacking in his keynote from NDC Minnesota 2022 as well. The talk was pretty hand-wavy, and the book he recommended as gospel for architecture design, "Righting Software", was not a great read either. It could be different strokes and all that, but I share the opinion that Billy's talks seem to be more style over substance.
Reminds me of the story about the R&D project to design a combined Shovel/Bayonet. "Wouldn't it be cool if..." is just a much catchier title than "why you should always have reasonable expectations about anything new".
It was about twice as long as it needed to be, but damn some of these comments are really harsh. Most people are not as competent in public presentations as they are at their actual careers, so I'm willing to cut the guy some slack. I know I'm much better at implementing advanced cryptography from maths papers than I am at presenting those concepts to non-autists, for example.
it's kinda hilarious that they even imagined touchscreens in a fighter plane. it's even hilariouser they actually implemented it. although it's also kinda depressing because it shows how little contact people have with reality.
That sounds like you've got no contact with reality. Things like GPS, waypoints, seeing the battlefield etc are obvious candidates for a touchscreen. If you want to see your friendly troops and detected enemy troops and known AA emplacements, you're going to want a large touch screen. Can't you see that?
@@f.d.3289 It's sad that you can't think. You said it's hilarious that they even imagined touchscreens in a fighter plane, but now you seem to admit that they are needed. And look at the photo: there are still traditional plane controls right there. How did you miss that? What plane control do you think they are putting in the touchscreen that shouldn't be there, in your expert opinion?
Clever Hans seems to keep on being relevant, especially in the area of machine learning. There was one attempt to train a system to identify malignant skin growths... it would flag anything with a ruler next to it.
How is that a design failure? That's the sort of mistake a developer makes; then you go "oops", crop out the ruler, and continue. The sort of thing that sets you back a few days.
@@johnflux1 My point was actually that Hans is applicable to machine learning (in addition to design) in terms of reacting to / training on the wrong thing. It's a "why does this horse keep on being relevant to different fields" situation. And I believe the development timeline was somewhat longer than a few days when that happened.
Yeah, that's not because of the touch screen. That's because you're not paying enough attention to your driving. If your phone was operated by a bunch of switches and dials it wouldn't make your driving better!
@@johnflux1 --- EDIT --- I just realised you might be asking how they would actually see the maps without a screen, and are using the term 'touchscreen' as a general term for a screen. We are talking specifically about the touch interface, not the fact of having a screen. Fighter jet pilots have information cast onto a lens that sits over one eye. They develop the ability to focus one eye on that while the other eye looks where they are going. Quite interesting how the human brain can do that. But even if they required a separate screen for maps and other information, the idea of them taking their hand off the controls to scroll over a map on a touchscreen, with all of the inherent difficulties associated with such an interface, while having to make split-second decisions at supersonic speeds, is absurd to me. --- END --- You honestly think that there is no other way to access maps and GPS than with a touchscreen? Are you under 25? You must be. In any case, anyone with any kind of imagination can come up with a better way for someone to access important information while operating a fighter jet with split-second timing than a touch screen. Buttons and dials don't get moody if your fingertips aren't moist enough, they don't stop working because there is too much gunk on them from your fingers, you can operate them WITH gloves on, and it is more difficult to press the wrong button when you have actual physical buttons rather than virtual ones on a touchscreen.
I strongly disagree on the idea that iteration does not lead to good design. The opposite is true: outside of trivial problems, iteration is the ONLY source of good designs.
Let's look at the best of the best, the iconic strategy game UIs: Starcraft 2 (LotV), Supreme Commander (FAF), and Factorio (1.1). These are amazing demonstrations of visually doing complex things fast. We know the history of these UIs. We can trace back step by step until we find WarCraft developers copying Dune 2 around 30 years ago, even using its graphics as dummies. Innovations and changes were introduced slowly, often individually, over decades. Never was there a prior, grand design on this scale! Never in their wildest dreams did anyone designing Dune 2 think "Alt-Shift-4 should remove the current selection from other hotkey groups and add it to #4." Dune 2 couldn't even select more than one unit!
Playing Factorio, I now build belt arrays like this: Shift-6 swaps my primary hotkey row to the contents of a blueprint book that uses global grid alignment to... you get the point. Early Factorio had no blueprints, only one hotbar... and that thing was actually a Minecraft-style inventory bar! Neither the entire interaction now, nor even its reasoning, made any sense when the idea of Factorio was thought up from a mix of modded Minecraft and Starcraft concepts.
All this was iteration, usually in remarkably small steps. No human I have ever seen can "design" such things by drawing up stuff on a whiteboard and polling anyone's opinions in 1990. It is a process of guided evolution, enhanced by our ability to reason, but in no way some grand plan articulated beforehand.
48:10 Ironic how he talks for an hour about designing programs to accommodate users but then makes the exact mistake he talks about, by blindly assuming every developer can just sit idly for 90 minutes and listen to office politics. Forcing someone like me to sit in a long meeting is a lose-lose situation: my brain will shut off after 15 minutes and I won't be able to contribute anything, while feeling exhausted for the rest of the day.
But we still see it at our level too, e.g. Tesla using touchscreens in cars. Same issue, but people think it's cool, until you have an accident because you have to look at the screen to perform the action. Though we should keep in mind the issue wasn't the touchscreen itself; it was that it failed 20% of the time. No mention of what those failures were.
@@allannielsen4752 Agree. I don't like touch screens in cars. I still prefer mechanical buttons; they are very easy to operate. I just can't stand 30FPS animations.
Lol, how is muscle memory going to help with things like GPS, waypoints, or seeing the battlefield? Those are obvious candidates for a touchscreen. If you want to see your friendly troops, detected enemy troops, and known AA emplacements, you're going to want a large touch screen.
One bad design from Microsoft? Hahah, if only it were one. I work for Microsoft, and everything is badly designed by default. Because, as the first slide shows, they go with guts and strong opinions. 0 architecture, 0 testing, 0 CI/CD.
I always thought touchscreen was a flawed interface design choice. It may be a necessity, but that doesn't mean more time and brainpower should go into supporting it than into seeking a viable alternative. "Good enough" should not be the end goal. Human Interface should be covered in software development courses, because so many times developers don't bother to learn how their applications are used, so they have no idea how a given feature will fit into the workflow. So they crank out a proof of concept implementation and proudly call it done, then are bewildered by the end-user complaints. The only course I took that even approached the topic was by an industry professional who seemed to believe that you couldn't predict how users function, so you just have to listen to them. Well, maybe learn enough about their tasks to get into their perspective, so you can first ask the right questions, and then get your hands into the interface yourself and test!
@@enkephalin07 Indeed. But I have flown virtual planes in flight sims. Things like GPS, waypoints, seeing the battlefield etc are obvious candidates for a touchscreen. If you want to see your friendly troops and detected enemy troops and known AA emplacements, you're going to want a large touch screen. Can't you see that?
Wouldn't it be cool… Wifi Wouldn't it be cool… iPhone Wouldn't it be cool… Netflix Wouldn't it be cool… Clippy? Oh! I get why he'd say it's a bad design approach, now.
@32:28 hearing him say this is rather insane and actually infuriating. Because developers are rarely ever given the authority to decide on design. There is no "attitude among developers" that they can stop once they "get it to work".
The business is the one deciding "hey, it didn't work last month and now it does, and our roadmap says to move on to these next features, so that's what we're doing". Among developers, all I've seen is frustration that they are not permitted to do things well and are forced to leave half-finished work. Famously, Brendan Eich's JavaScript prototype was taken by the business and shipped before he ever said it was ready.
Billy Hollis saying things like this makes me suspect he is a coward, focusing on blaming developers for the behavior of executives, because that aligns well with his paycheck: the businesses he works for also think their bad outcomes are the fault of developers, and I'm sure are happy to hire a consultant who seems to have the same attitude.
In my experience, all of these problems from "wall of data grids" to "my job is done" are decisions of executives, not of developers. Developers always push back as much as possible, but they ultimately do not have the power here. If the executive wants their stupid page-flipping animation, they will get it. And they refuse to listen to developers who tell them it's a bad idea.
yup,
I've worked in places where common discussion-topics among the developers include:
- are we going to sneak this piece of actual quality past management?
- this issue is known and solvable; are we going to apply the above to sneak a band-aid past them, or keep our mouths shut and hope the failure *we've warned them about* makes a sufficient impact that we'll actually be allowed the time to fix things properly?
- don't *ever* inform management how fast you could put in a band-aid solution. Save it for the day this becomes a 'production is completely shut down, customers can't do anything, call people out of bed if you need to, this needs to get fixed NOW' situation. In *any* situation other than that get-called-out-of-bed phone call, they will either use it as an excuse to not fix things properly, or make it the estimate for the 'proper' solution, to be released next week/month.
- and of course the common: if you have an estimate, double it. In the (uncommon) scenario no disturbances come up, this can be used for much needed cleanup.
So true, by the time it gets to us (developers) the design has already been done. It's our job to implement it and make it performant. I personally think that's why we have the millions of abstractions, the layers, the microservices, etc. I believe it's a direct result: they let us mitigate bad design quickly and get the next bad decision out into the wild. And ultimately, the whole thing is driven by the next dollar, the shareholder, etc, not the end-user.
Having worked with walls of data grids: they aren't that intuitive to learn, but for someone who has spent the effort learning them, boy can they be efficient.
I have a feeling that the F35 issue is more complicated than the speaker makes it seem, but in general I really liked this session and felt like he spelled out a lot of things that I've experienced in software development before.
Any expectation that one can sufficiently analyze all the causal elements of a trillion dollar failure in a single live stage presentation is materially ludicrous.
@@JeremyAndersonBoise that's a long way of saying "you're stupid". i didn't have the expectation that they explained the case in all the details, the other extreme is just as bad. i guess i should have said "i got a strong feeling that the speaker oversimplified the case a lot more than needed and it felt one-sided" but if you're expecting that from me i think your expectation that one can sufficiently express their thoughts and feelings to your levels of nuance in a single youtube comment is materially ludicrous.
@@FunctionGermany😂 I would feel flattered.
The moment I read about a touchscreen in a fighter jet, I felt it was doomed to failure. There's a reason switches work, and this is a situation where you have to be sure of what you just did. I can understand my phone (really just a portable web browser) being touch; I'm asking it to be a general purpose computer. Not to mention how horrible modern web site design is with its lack of affordances.
He is repeating a myth spread by Russian propaganda. He showcases that he is a person who does not check his sources, and instead just blindly repeats what he heard others say.
Mild objection to the idea around 32:40 that this is developers' fault. More often, management won't let them spend time on doing a good job so they have to bang something basically functional together quickly, or it's given to the guy who's available or cheap who doesn't know how to do anything better.
Was thinking the same.
Though it doesn't change the core point: treating 'it is theoretically possible to work with this' as 'good enough' is a bad practice; the good alternative just costs more time. (Which will earn itself back, now if only sales were capable enough of convincing the client of this fact.)
The pressure to choose 'meet more deadlines' over 'do things well' increases exponentially with additional layers of 'squeeze more deadlines out of those below you' in the company.
And indeed it is pretty rare that it comes from a decent developer who isn't pressured either by a very hands-on manager trying to ram through every project asap, or by an aggregated-deadline structure intended to do the exact same while giving the manager in question 'plausible deniability' on any single project messed up by this 'move fast and blame others for what it breaks' approach.
I was more or less the one guy who cared about UX at (company name undisclosed).
At some point scrum was introduced. Nobody could figure out how UX fit into the rituals, and perhaps management were able to predict that I would have pointed out that scrum wasn't working (because yeah, I was pretty vocal when voicing complaints, especially when I was right), so I got bumped out of the scrum teams to do DevX stuff instead.
Eventually management decided that DevX wasn't worth any spend either, so now I'm out of work.
The glib interpretation of this flow is that to have job security, you're better off just doing the grunt work of implementing the features with zero regard for UX or true quality of any sort.
"Design is not how it looks, it’s how it works" - Steve Jobs
Thing with the F-35 is that the airplane is mind-bogglingly ambitious and forward-thinking, unlike literally any other plane that exists.
The trillion-dollar number is also extremely misleading; it's the total lifetime cost of procurement. The actual cost per plane is nearly half that of the F-22; to my knowledge the F-35 is the least expensive stealth aircraft in the world, and the most capable by far.
The title of the article itself is misleading: the next-gen plane is not a replacement for the F-35, but rather a replacement for a whole class of roles for which the F-35 is overkill. The existence of the F-16 doesn't make the F-15 a failure. The F-15 is the most successful air superiority fighter in history. But it's overkill for all the uses the F-16 gets put to. Same with the F-35.
The point about touch screens in cockpits is something, but the actual UX challenges of a platform like the F-35 are bounded by an absurd amount of complexity that flip switches cannot handle. My understanding is that a lot of the cockpit experience problems were due to the helmet system taking a long time to reach maturity, and this meant the contextual flow was jank in the early phases of rollout.
Screens can context switch and toggles cannot. Physical controls are great, however, so there is something to the case, but to actually make a point about good design you do need to take the constraints seriously, and nothing serious was addressed in this talk imo.
th-cam.com/video/CH8o9DIIXqI/w-d-xo.html
Link to a good explainer video on the f35
A key point remains:
there was a bunch of 'wouldn't it be cool if' over 'what does the user actually want' design involved...
not all of it, sure, but definitely some, and as 'an attention-grabbing example' it did exactly what it should be doing.
case in point:
"contextual flow was jank in the early phases of rollout"
in 'mid/late phases of user-testing' this would be acceptable.
the 'usual' approach of 'if your version ends in .0 expect problems that will be resolved by the next update' is one of those things that should not happen here.
the 'detail' that the cost of an early-in-version-lifecycle bug in this use-case is death should have been one of those 'don't blindly copy over requirements from unrelated scenarios'-triggers.
@@user-jn4sw3iw4h The plane isn't actually a failure though, so the key point is fundamentally wrong.
@@backgammonbacon No,
Neither the *key* point of 'if the 'wouldn't it be cool if' argument is your only reason for a design-choice, it's probably a bad design-choice,'
Nor the 'here's an attention-grabber example, to get/hold the audience's attention' - placement in the presentation
Is in any way invalidated by any 'late testing-phase'-resolution to the 'they didn't bother to check what feedback the users required before they started implementation'-problem.
A 'less than perfect implementation'? Probably.
A 'complete failure invalidating all key aspects'? No, absolutely not.
(Whether those last 2 sentences describe 'the plane' or 'the presentation', i'll leave open for interpretation)
I saw this talk and immediately recognised one of the patterns that we also did at my previous employer (and, surprise, it failed miserably). Thank you for the talk! I enjoyed this recipe for micro service disaster!
Yes, I know! The good old "almost no design" pattern! Although my favourite is the "alibi design" pattern: there is some design, it may even be okay, but after two weeks the software has no resemblance whatsoever to it. But: "hey, we have a design!"
Oh man, I'm so glad I'm not working in software development anymore.
I love these types of talks but feel many of them fall short of answering or addressing the root issues. For example, the Rolling Stones didn't create music using Agile methodologies. No one told Keith Richards he had 2 weeks to refine a melody. Great artists didn't give a fuck if people were consulted or focus groups formed. IT JUST HAPPENED!!! I'm not dismissing the speaker's content. It was decent. But highly creative people just make things happen and then the world takes notice after the fact.
Oh... and btw.... my wife is an ER physician. She and most all doctors will tell you their available software sucks beyond belief. So I'm highly skeptical of this notion that he "fixed" all problems related to radiology.
If I learned one thing working the last 4 years as a software engineer, it's that for the most part (like 95%) it's never our fault, or even the design team's. It's always the management. Always. Those people demand unrealistic feature sets based on their quarterly analysis and keep navigating the ship from one feature to the next without any weight given to the engineering/design team's opinion, until the ship is wrecked. Then they look for another team to replace us.
Hey, but it's your job as an engineer or designer to guide them better, don't you know? It's not. It's not my job to do politics, wasting my team's time in useless meetings and scrum. If management doesn't get it the first time, they should resign after the shipwreck rather than telling others what to do. And why are you even there to begin with if you have no domain knowledge? Let it be just us, designers, and users. Get out of the way!
Rant over.
"Modern UI" design SUCKS it is incredibly wasteful of space and too much focus on small screens.
I almost lost a finger because a touch screen didn't respond when I touched a button to stop a motor. I hit the button again, because, well, I really wanted that motor off. Ok, it worked the second time; now to get the job done. Unfortunately, the first touch *did* work, but updating the UI to reflect that the motor was off took about ten seconds. By the time the second touch registered, turning the motor back on, I had my hand in the machine. That was 15 years ago.
A few months ago at my current job, there was a software update that changed the UI from a "safe" version (the on button is in a different spot on the screen than the off button) to the arrangement that caused my accident (the on and off buttons are in the same spot, just colored red and green, which, BTW, causes another problem for color blind people).
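The failure mode in this story is easy to state in code. A minimal sketch (hypothetical Motor and panel API, not any real control system): a toggle button flips whatever the actual state is while the display lags behind it, so a "just to be sure" second tap restarts the motor; an explicit stop command is idempotent and safe to repeat.

```java
/** Sketch of the hazard: a toggle acts on state the user cannot see yet,
 *  because the display lags the device by seconds. A dedicated STOP
 *  button is idempotent, so repeated taps are harmless. */
public class MotorPanel {
    interface Motor {
        boolean isRunning();
        void start();
        void stop();
    }

    // Unsafe: each tap toggles. Tap #1 stops the motor; a "just to be
    // sure" tap #2 (screen still shows "running") restarts it.
    static void onToggleTapped(Motor motor) {
        if (motor.isRunning()) motor.stop(); else motor.start();
    }

    // Safe: an explicit command. Tapping twice still means "stop".
    static void onStopTapped(Motor motor) {
        motor.stop(); // idempotent: repeating the command changes nothing
    }

    public static void main(String[] args) {
        Motor motor = new Motor() {
            boolean running = true;
            public boolean isRunning() { return running; }
            public void start() { running = true;  System.out.println("motor ON"); }
            public void stop()  { running = false; System.out.println("motor OFF"); }
        };
        onToggleTapped(motor); // motor OFF (but the lagging screen still says ON...)
        onToggleTapped(motor); // motor ON  -- the accident
        onStopTapped(motor);   // motor OFF
        onStopTapped(motor);   // motor OFF -- still off, no harm done
    }
}
```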
Nothing beats a real switch.
We can see this by the near complete lack of interest in "smart bulbs".
Just imagine all the bulbs in my house were "smart" bulbs: every time I enter the basement or bathroom, shouting "light on" or "light off" instead of flipping a switch. No, @@myne00
My company first made a UI for a browser and then decided to limit support to one single browser. Now we have the worst of both worlds: no hotkeys, no windows, no drag-n-drop, and no easy access with whatever browser the user already has on the system.
I wouldn't want to be sitting in the cockpit and see the pop-up "Are you sure you want to deploy the antimissile decoy now?"
The thing I really dislike about this talk is how he assumes developers have to be the ones to do the UX.
Developers should not and must not be tasked with doing ux.
"[...] developers tend to be rather code Centric [...] and sometimes they [...] blow off all the other things that they really ought to do"
Is a fundamentally flawed statement. Developers are hired to write code. Any company that tasks developers with doing more work than just writing code will have the employees complain about having to do too much work.
If you want good architecture, hire a code architect. If you want good testing, hire QA. If you want good communication to users, hire support and if you want good UX, then hire UX.
Don't make a developer do it and then be angry when they can't do it!
Data grids also happen because the business did a proof of concept in Excel, and the devs copy it
Excel is the world's most successful functional programming language. It's kind of weird that all implementations of the interpreter that I can find have data grid GUIs rather than, say, a console interface, but Excel is still the most widely used functional programming language.
Texan-sounding fellow starts with what is essentially, "The government can't do anything correctly, so what else would we expect?" Cue my eye-roll. Also, he is comparing defense contractors designing specialized equipment in an essentially captive market with software development in a much more competitive field where users can vote with their feet.
The rest of the talk seems reasonable, though.
Ca. 21:07 on Hick's Law shows a reasonable UI for selecting among a set of nested choices, so long as the choices make sense to the intended user. It is complicated, but it is more usable than the flat hierarchy/icon-based UI shown on the previous slide.
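For anyone who wants the actual math behind that comment: Hick's Law is usually given in its Hick-Hyman form, relating mean decision time to the number of equally likely choices (this is the standard textbook formulation, not something shown in the talk):

```latex
% Hick's Law (Hick-Hyman form): T is the mean decision time for n
% equally likely choices, a is the base reaction time, and b is an
% empirically fitted constant.
T = a + b \log_2(n + 1)
```

One caveat worth knowing: the logarithmic relation assumes the user can subdivide the choices (alphabetical order, meaningful grouping); with an arbitrary wall of icons, search degrades toward a linear scan, which is roughly the contrast those two slides are drawing.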
Ah yes, Windows 8... It was bad. Plain bad. But at least there was the idea behind it that it was designed for tablets. But they slapped that moronic interface on Server 2012. BECAUSE I LOVE ADMINISTRATING SERVERS FROM A BLOODY TABLET.
16:18 Everyone is always shitting on Swing for "not having animation", but we were doing animations in Swing, and even AWT. It's not that it can't be done, it's just that it requires writing bits of the framework, or these days, finding a library which does the repetitive bits.
I have recently learned how to do animation in Compose, and aside from it being very declarative in Compose, I feel like the way it worked in hand-written animation code was sometimes simpler to understand when maintaining the code. The way animation code works in Compose really feels like programmatically building up collections of key frames, which I guess is nice and declarative, but sometimes you just wish you could bring up an animation editor, edit the damn thing as a resource, and save the file, like in Unity or Godot Editor.
Maybe it's just me though, because I've been doing animations since the AWT days.
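For anyone who never saw it done: the "writing bits of the framework" mentioned above usually amounted to a javax.swing.Timer driving repaints while the component interpolates its own state. Here is a minimal sketch of that style; the class and field names are illustrative, not from any real codebase:

```java
import javax.swing.*;
import java.awt.*;

// Hand-rolled Swing animation: a javax.swing.Timer fires on the Event
// Dispatch Thread, advances the animated state, and requests a repaint.
public class FadeInLabel extends JComponent {
    private float alpha = 0f; // current opacity, 0..1

    public FadeInLabel() {
        // ~60 fps tick; the timer stops itself once the fade completes.
        Timer timer = new Timer(16, e -> {
            alpha = Math.min(1f, alpha + 0.02f);
            repaint();
            if (alpha >= 1f) ((Timer) e.getSource()).stop();
        });
        timer.start();
    }

    @Override
    protected void paintComponent(Graphics g) {
        Graphics2D g2 = (Graphics2D) g.create();
        // Composite controls how translucent the drawing is this frame.
        g2.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, alpha));
        g2.setFont(new Font("SansSerif", Font.BOLD, 24));
        g2.drawString("Hello, Swing animation", 20, 40);
        g2.dispose();
    }

    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Fade demo");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(new FadeInLabel());
            frame.setSize(320, 120);
            frame.setVisible(true);
        });
    }
}
```

Declarative frameworks like Compose generate exactly this kind of timer-and-interpolation plumbing for you; whether that beats reading the explicit loop is the maintenance judgment call being made above.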
The AS/400 app reminded me of a rewrite of a VT100 terminal in Java. The users really didn't like it, and it was slower than their "DOS"-like terminal.
Now, with Java on serverless, slowness costs the company money, so they allow the apps to be written correctly and fast. My own desktop and Android Java apps were always snappy.
Great talk
“Wouldn’t it be cool if” Is why a lot of us became programmers. You sound like the fun police. I do completely agree that user convenience trumps developer convenience.
There's a key point missing though:
He's not saying “Wouldn’t it be cool if” is a reason not to do something.
It's the reason to ask/check for other reasons to do it. (assuming costs are involved)
“Wouldn’t it be cool if” is an invaluable tool in either the brainstorming phase, or for picking hobbies. (how many of us became programmers)
It should however not be considered enough for the refinement phase, or for picking a career.
That's where something like "it would be a good selling point, and now that I've actually thought about it for a moment, here are 5 more benefits (with the following fine-tune potential)" or "given the trivial scalability, testability and reversibility (in 90+% of cases) it's such a universal solution -> it provides variation, job security and decent pay" comes in.
My interpretation of what he's saying is that "wouldn't it be cool if" is antithetical to good design choices. And I personally think it is a key first step in all designs. Then he bashes the 3-D explosions that I'm sure tons of kids love to play with, and even though it makes him smile while he's doing the talk, he says no one needed it and it's a complete waste of development dollars. That offends me. It's the little things in life that make it worth living.
@@Diamonddrake that "my interpretation" might be key here.
I agree the introduction story of the 'wouldn't it be cool as red flag' point could have been clearer on this, but in the closing summary he unambiguously states it's 'doing the thing *only* because it's cool' that's the problem.
As for the '3d explosion'-thing:
- is it an issue it was created at all? No
- might it be an indication of an unhealthy priority that the admittedly entertaining little toy was the *key* presentation, at a major event, from a party the size of 'both a top-3 OS vendor and a top-3 corporate cloud service provider'? Probably.
The WORST UI decision ever made was making buttons "dynamic" and making tabs resize instead of statically sized but multi-row. On Windows XP you could mentally queue up commands to perform because you already knew after you clicked something where the next button you would click was going to be. You could use the OS faster than it could process your commands because it was static like that. Then they made everything animated and dynamically positioned, so now you have to look and see where the next button you're going to click is. Incalculable wasted time over every user of Windows.
Usability testing is IMHO the best way to quickly see if a design works or not. What the heck is a "world class designer/architect"?
There's another side of the "cool" coin - great UX ideas getting shot down because they may seem fancy or extraneous when sometimes they're the details that breathe life into an application (versus selling software that just "tries to get a passing grade").
Seems like this is kind of a long-winded "start with why" point being made.
I worked on contract for MS during the Windows 8 era. Billy is so unironically cool. Also correct.
The world would be doomed if someone invented a touchscreen “red button”
5:47 "Put yourself in the mind of the user"
That won't work. The people making the decision truly believe their solution is what the final user needs/wants. Any attempt to think like the users will just validate their decisions.
There are real design concerns in the physical switches vs touchscreen point, the AS/400 sales example, and the MS toy explosion videos, but he keeps stepping outside his area of expertise and saying seemingly stupid or inappropriate things that taint the legitimate message. Like when he dumps on the government he isn't professional, appropriate, or logical: he does it with a lot of "I imagine they had a meeting and said..." instead of real evidence, and ignores that governments must operate differently than businesses because they do things that businesses can't or won't. There are real complaints to be made about the F-35, and this person has only a superficial grasp of them for someone who spent so long talking about it. Then, in a talk about thinking about design, he leaves his notes and email up instead of designing his workflow to have it ready with a single alt-tab, or at least minimizing the other windows. Maybe this guy was serious once upon a time, but he seems like a clown now.
Thank you for saving me from writing an entire comment, as you covered my objections to the talk eloquently. Showing disdain for all government projects when, as in private industry, some succeed and some don't. Talking about Microsoft as if it's some clown college that didn't invent, I don't know, the Azure cloud.
We don't do computers with touch keyboards. Mechanical tactile feedback is important for efficiency when your eyes are focused somewhere else.
On mobile phones it works because you can look at the keyboard while typing with two thumbs.
On a computer my eyes are on the screen, the keyboard needs to work blind, and the feedback from a keyboard is more valuable than the screen prompt.
Especially on a fighter jet you need to know if you hit the button without looking.
I don't follow the question at 1:01:27. I gather it's about why his website is the way it is. It looks fine to me. The problem I have with it is that it wasn't served up over HTTPS. Blog is shorter than I would have expected. There's only one article and it's a 6-year-old white paper.
This is only 6 months old, but I can't find the "edit more" option in Photos anymore. I immediately tried to add explosions to any videos I had at hand. Edit: Oh yeah, I have W10 so it might be a W11 feature, bummers.
I find the first mistake of the design team was not repeatedly interviewing fighter jet pilots throughout the development: first to gather their opinions about what they want from the jet and the interface, and then to pass the later ideas through them, getting feedback at pretty much every point of the development. You can't sell a product as a black box that you know nothing about, know nothing about the end user, and don't understand yourself.
Regarding the tactile buttons vs touch screen: when I had my Nokia cellphone back in the day, as a teenager I could ride a bike while texting with the phone in my pocket. Today that's a guaranteed incomprehensible message and potentially a traffic accident. Even when texting in an unstressed situation, I often find myself cursing because apparently my thumb is not touch screen compatible, either not activating it or hitting wrong keys repeatedly. Pressing small interface buttons on a smartphone is impossible to do right on the first attempt. And I bet everyone has noticed that difference. What about touch buttons powering on and off from just an accidental swipe, and sometimes even from random objects that aren't your hand? Now does that experience sound like the choice for a situation where reaction time and life/death are the top priorities?
"8 times the code" also rings an alarm bell for me for some reason. I understand more complicated stuff requires more complicated code (at least sometimes), but in a fighter-jet-like application my philosophy would be that you want to minimize the opportunities to fail, which to me would mean minimizing the amount of code (all of which inevitably needs to be analysed and debugged to show that possible issues _can't_ happen).
Oh how I've hated all the apps that moved to a browser interface. It is so often such a compromise that you start to shiver the moment you hear it's browser based, before even seeing it. I understand it in a thing like SAP, where you'd do remote access for licenses or whatever so it's just practical to do it that way, but the browser is not a comfortable environment to use an app. First of all, it usually robs all the key binds and doesn't follow your normal hotkey patterns.
I love the "red flag phrase: the users just need more training on that". I've been telling anyone who agrees to listen in most of the circumstances it's relevant with, that at least from my engineering perspective, if the user has any ambiguity, any confusion, if it's not intuitive right from the glance how they get their peak priority done in the thing, it's not good design. It's particularly obvious to me because when I open some interface like webpage or instructions or something, I'm not opening them because I want to learn it through and through by heart. I'm usually at least slightly annoyed, frustrated, upset. I have a problem and I want to get a thing done. That's when I'm really unable to read a long sentence or a paragraph of sentences, browse through menus and whatnot. It has to be obvious or I'm gonna miss it. And if I in fact am in a more relaxed state, have time to look it up etc, I'm just left thinking "well that's shitty, I've spent more time trying to figure out where to find that one little thing that should probably be the one of the most likely user need, or at least more critical one if not the most common one, that it should be glaringly obvious where to find this information". Usually webpages and interfaces are full of unnecessary or once in a blue moon stuff that for some reason just was deemed to be one of the more core features or options.
Basically it's like being in a school that has maths and physics (or coding). Most people are not too dumb to understand how to solve the problem. Most people are just really frustrated with material they can't figure out how to apply to their specific problem, because it's confusing and the rules aren't obvious. I feel like coding particularly shines at this. There's all the material, either documentation or course material showing how to do stuff. Then you engage in a task, you've figured out the design patterns and so on, but you spend forever trying to figure out how in the hell the syntax and interface for that one thing works, and none of the material or documentation covers it. Rust is known for having pretty good examples and material, and even rustlings, a training system or whatever. But then every once in a while there's an exercise where you think you're writing just a normal application, putting things together, using things nested or something, only to finally find out that for one of the features you need a syntax of {} or something, where none of the existing material you find online shows using those instead of (), for example, with that feature, and unless you use those, it doesn't work. Everything else in your whole code is fine, but you're ready to give up because nothing makes it work. Except that one spot, where everything else uses (), heck, even that feature uses () normally, but now there's a hidden rule that you need to use a different thing. It's not even a good learning event, because you just discovered a magical artefact that was woven to be kept secret from everybody, a hidden extra that you have to find for this very specific unique case.
For some reason this kind of design is close to my heart, where you approach a thing saying "no, that does not work, people are too stupid to use that; no, you can't do that because it allows people to make mistakes; no, that doesn't pass because it requires learning and knowing about this and that; no, that doesn't fly because it's not explained in unambiguous, short, accurate terms" and try to iterate to something where you just don't need a manual to use it even if you know nothing about it. I think some manufacturing or product design philosophies follow that somewhat, or tell you to make intuitive user interfaces that either take advantage of a concept everyone is familiar with, or, if a design looks like an existing familiar design but doesn't work that way, change it enough that the user doesn't think of that other application. That's what I dig. Trying to make everyone's life easier.
In fact Microsoft was sort of right that "folders are where apps go to die". Windows needs folders to hide all the useless crap it ships with, so you can have a start menu or toolbar with fewer than 10 of your favourite apps.
48:10
A 90 minute meeting on a new feature to be built, and in less than 10 minutes they start talking about the actual functionality to the point that dreaming off into code becomes an option?
Where can I join?
2:50 Well, that is 1.5x the YEARLY GDP of Poland, not 1.5 Polands.
20:54 😳 This looks just like the screen that pops up on macOS when you hit the Applications button. I hate it every day.
56:37 ... "A spreadsheet"???? Isn't that the same as a data grid. :D :D :D :D :D
Yes, but this is probably the correct use case for it.
He didn't say that one should never use datagrids. You missed the point.
@@TheVincent0268 You missed my "wall of laughing faces"
32:22 you could've made this even better just by removing "NOT A DRILL", replacing it with "DANGER!!", and changing the color of all the text to red
I always get mad at the Windows 8 example, because it was really nice. The issue was not the UX but the inconsistency, and the fact that half of the things were still Windows 7 and the rest were in new places, moved randomly. I didn't find it that cool, but the tiles were quite useful and added productivity. And what's up with the "no folders" claim? I always had folders. Or was that in a very early version for insiders?
😢
My main beef with win 8 was that in 8.0 control panel/settings didn't show up in regular search, you had to go to a separate tab. 8.1 made it one universal search, and I really enjoyed it. Now, in win11, I miss the tiles - in fact, I miss getting to decide for myself how many apps I wanted to have visible on my start screen/menu. I preferred 8/10s icons/tiles.
i think that part of the problem was that the screen completely changed when you entered the start menu. it confused users because the context of what they were doing was completely lost, and it was not immediately clear how to get back, for example
@@xybersurfer Not to mention the focus on "everything maximized to the whole screen" (start menu included). It makes sense for 99% of the use cases, and much of the rest is handled by putting two "full screen" windows side by side... but it fuelled the whole "Windows no longer has (overlapping) windows" fun :D
It doesn't help that it clearly was a design that was honed by small handheld devices with touchscreens... and by that point, those were _heavily_ associated with horrible anti-consumer practices.
@@LuaanTi The issue was:
The 'let's make "Windows for phones or tablets" the only way to do things now, even on _actual computers_' design decision.
Rightfully backtracked into 'let's make it the default', within a week.
Maybe the massive rush on 'and how do we switch away from the default' was an overreaction-driven example of 'users with status-quo bias'. I'd go with probably, but it was reasonably predictable.
Now if only the release process had even a fraction of the design attention they (claimed to have) put into the ''Windows for phones' is a great OS for desktops, trust us' pitch itself.
At around 46:20 you say "It is true that users can not......". Can I add that it is also true that many companies don't seem to have people with different perception skills, like dyslexia, on their UI design teams. As a dyslexic myself, I say "seem" because so many new UIs demonstrate it.
thanks for the great insights into the design world! :)
If a touchscreen is added to a device, it will eventually drive out other input mechanisms.
Not entirely coolness - anything with moving parts is expensive.
In the given example of the F-35, I call BS on the "expensive" part. It was probably about a 1000 times more expensive to write the code, debug the code, get the code integrated, have a user interface on a touch screen, etc. than to add a manual switch. The development and integration cost is only recovered if you sell a few hundred thousand of the items you're putting it into.
@@guidon.5413 Why is no one able to think about it for a minute? How are you going to implement a GPS map with manual switches? Of course you need a touch screen!
@@johnflux1 - no, you don't. You might not have thought this through. A) You don't need a touch screen for a GPS map. If you think you need this, think again. GPS devices without touch have been around forever. B) You might want to do a touch screen for one specific purpose and nothing else.
The person not thinking is you.
@@guidon.5413 "You don't need" - lol, so you're arguing for having a touch screen for some particular purpose, so the hardware is all already there, but then actively prevent them from using it to do basic pan and zoom? Is that really what you're pushing for?
He compares F35 cost to Poland's ANNUAL GDP. That's what's called a 'flow variable'. Has time in the denominator. F35 cost is not. It's like saying 'the Burj Khalifa is 828m high - that's more than twice the speed of sound!'
If you were doing math then you're correct, but he's just using it as a language tool to aid understanding. He is essentially saying "this project cost more than the GDP of a reasonably sized country" to give a sense of scale. GDP is by its definition annual.
TL;DR: don't code it like that, but it is reasonable language use.
@@alan_davis you are entitled to your opinion but I disagree. It's not maths, it's something you are taught in first-year physics, and if you think *really* hard you will understand that the most famous equation of the 20th century, E=mc², was easier than it looks, because the units on both sides must match. This topic was also raised and highly recommended elsewhere in the comments section.
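For what it's worth, the dimensional point this thread is circling can be written out in one line (just the units argument, no claim about the actual dollar figures):

```latex
% Program cost is a stock: plain dollars. GDP is a flow: dollars per year.
% So "cost = 1.5 x GDP" is only dimensionally consistent with a time span:
\underbrace{\text{cost}}_{[\$]}
  \;\approx\; 1.5 \times \underbrace{\text{GDP}_{\text{Poland}}}_{[\$/\text{yr}]} \times 1\,\text{yr}
```

Read that way, "costs 1.5x Poland's annual GDP" is fine as informal shorthand; what it can't mean is "1.5 Polands".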
I think I'd have left this one before the half-way point... while it's got an important underlying message (UX is important / ask "why?"), it was long-winded and not very engaging.
You could do this talk in 20 minutes with 2-3 key examples (none of the ones from this) rather than deciding to claim the F35 project cost was based on the choice of a touchscreen... which I suspect lost most rational viewers immediately.
The topic is interesting, but I would expect somewhat better slides from a UI design expert. The slides, full of text, all in the same font, and appearing all at once, are very distracting.
Windows 7 appeared so ugly to me, I was happy with Windows 8 and then 8.1.
It was far from perfect, with some design mistakes not fixed yet, but for me it signified the start of the move away from unnecessarily fiddly and over-decorated interfaces.
The transition from WinForms to XAML-based UIs had to happen, but the new UI language wasn't figured out yet.
I think it is comparable to the evolution of Android UI that was happening during approximately same time period - a lot of trial and error was happening there during first few major versions.
What is wrong with you?
This is possibly the least well argued case I have ever seen on NDC, and I've seen some shockers.
Now, I'm not a huge fan of the F35. But I would never premise my whole argument on the completely made up, uninformed guess that the F35 designers based their design on the phrase "wouldn't it be cool if", and then draw a whole bunch of conclusions from that.
I would never be so daft as to think that I could draw an analogy between "not using the best jet available all the time" and "not using the best software available all the time", because of course you don't use the jet with high operating cost for operations that don't need its capabilities. That sort of operating cost per flight hours problem simply does not apply to software. The analogy is boneheaded.
Why would I keep listening to a presenter who introduces their argument with such stupidity?
I'm willing to assume he's knowledgeable about his area of expertise, and not knowledgeable about defence procurement or finding suitable analogies for conference talks, and give him some benefit of the doubt.
It’s difficult to have it both ways. He crafted a speech/presentation choosing a specific analogy to best convey his point. If he’s not an expert in that domain or doesn’t understand the analogy he’s creating, it says something about his expertise as a presenter and should call into question his assertions.
I've found Billy Hollis a bit lacking in his keynote from NDC Minnesota 2022 as well. The talk was pretty hand-wavy, and the book he recommended as gospel for architecture design, "Righting Software", was not a great read either.
It could be different strokes and all that, but I share the opinion that Billy’s talks seem to be more style over substance.
Reminds me of the story about the R&D project to design a combined Shovel/Bayonet. "Wouldn't it be cool if..." is just a much catchier title than "why you should always have reasonable expectations about anything new".
I found a lot of his talk in general very hand-wavy and rambling. I honestly gave up at the 33 minute mark.
It was about twice as long as it needed to be, but damn some of these comments are really harsh. Most people are not as competent in public presentations as they are at their actual careers, so I'm willing to cut the guy some slack. I know I'm much better at implementing advanced cryptography from maths papers than I am at presenting those concepts to non-autists, for example.
it's kinda hilarious that they even imagined touchscreens in a fighter plane. it's even hilariouser they actually implemented it. although it's also kinda depressing because it shows how little contact people have with reality.
That sounds like you've got no contact with reality. Things like GPS, waypoints, seeing the battlefield etc are obvious candidates for a touchscreen. If you want to see your friendly troops and detected enemy troops and known AA emplacements, you're going to want a large touch screen. Can't you see that?
@@johnflux1 sure, but not for plane controls. Also, it's kinda sad you can't formulate your comment in a more civilized way.
@@f.d.3289 It's sad that you can't think. You said it's hilarious that they even imagined touchscreens in a fighterplane, but now seem to admit that they are needed. And look at the photo - there's still traditional plane controls right there. How did you miss that? What plane control do you think they are putting in the touchscreen that shouldn't be there in your expert opinion?
You're kinda sweet you know
Clever Hans seems to keep on being relevant, especially in the area of machine learning. There was one attempt to train a system to identify malignant skin growths... it would flag anything with a ruler next to it.
How is that a design failure? That's the sort of mistake a developer makes, then you go "oops", crop out the ruler, and continue. The sort of thing that sets you back a few days.
@@johnflux1 My point was actually that Hans is applicable to machine learning (in addition to design) in terms of reacting to / training on the wrong thing. It's a "why does this horse keep on being relevant to different fields" situation. And I believe the development timeline was somewhat longer when that happened.
Hahaha, who puts touch screens in a fighter jet? I nearly drive off the road when trying to operate Spotify on my phone lol
Yeah, that's not because of the touch screen. That's because you're not paying enough attention to your driving. If your phone was operated by a bunch of switches and dials it wouldn't make your driving better!
And how do you think they should show GPS and maps without a touchscreen? Lol.
@@johnflux1 Buttons and dials!
@@DrSpooglemon Honestly not sure if you're serious. Too many people seem to be arguing against a touchscreen.
@@johnflux1 --- EDIT ---
I just realised you might be asking how they would actually see the maps without a screen, using the term 'touchscreen' as a general term for a screen. We are talking specifically about the touch interface, not the fact of having a screen. Fighter jet pilots have information cast onto a lens that sits over one eye. They develop the ability to focus one eye on that while the other eye looks where they are going. Quite interesting how the human brain can do that. But even if they required a separate screen for maps and other information, the idea of them taking their hand off the controls to scroll over a map on a touchscreen, with all of the inherent difficulties associated with such an interface, while having to make split-second decisions at supersonic speeds, is absurd to me.
--- END ---
You honestly think that there is no other way to access maps and GPS than with a touchscreen? Are you under 25? You must be.
In any case, anyone with any kind of imagination can come up with a better way for someone to access important information while operating a fighter jet with split-second timing than a touch screen. Buttons and dials don't get moody if your fingertips aren't moist enough, they don't stop working because there is too much gunk on them from your fingers, you can operate them WITH gloves on, and it is more difficult to press the wrong button when you have actual physical buttons rather than virtual ones on a touchscreen.
Netflix could learn from Hick's law :D
I strongly disagree on the idea that iteration does not lead to good design. The opposite is true: outside of trivial problems, iteration is the ONLY source of good designs.
Let's look at the best of the best, the iconic strategy game UIs: Starcraft 2 (LotV), Supreme Commander (FAF), and Factorio (1.1). These are amazing demonstrations of visually doing complex things fast.
We know the history of these UIs. We can trace back step by step until we find WarCraft developers copying Dune 2 around 30 years ago, even using its graphics as dummies. Innovations and changes were introduced slowly, often individually, over decades. Never was there a prior, grand design on this scale! Never in their wildest dreams did anyone designing Dune 2 think "Alt-Shift-4 should remove the current selection from other hotkey groups and add it to #4." Dune 2 couldn't even select more than one unit!
Playing Factorio, I now build belt arrays like this: Shift-6 swaps my primary hotkey row to the contents of a blueprint book that uses global grid alignment to... you get the point. Early Factorio had no blueprints, only one hotbar... and that thing was actually a Minecraft-style inventory bar! Neither the entire interaction now, nor even its reasoning made any sense when the idea of Factorio was thought up from a mix of modded Minecraft and Starcraft concepts.
All this was iteration, usually in remarkably small steps. No human I have ever seen can "design" such things by drawing up stuff on a whiteboard and polling anyone's opinions in 1990. It is a process of guided evolution, enhanced by our ability to reason, but in no way some grand plan articulated beforehand.
The "clever Hans" sounds a bit like the current LLM so called "AI"
If you're bombing along in turbulence, it's even difficult to hit a button.
48:10 Ironic how he talks for an hour about designing programs to accommodate users but then makes the exact mistakes he talks about by blindly assuming every developer can just sit idly for 90 minutes and listen to office politics. Forcing someone like me to sit in a long meeting is a lose-lose situation: my brain will shut off after 15 minutes and I won't be able to contribute anything, while feeling exhausted for the rest of the day.
I am an intermediate backend developer, and even I know I would never put a touch screen panel on a freaking fighter plane where muscle memory is the key.
But we still see it at our level too, e.g. Tesla using touchscreens in cars. Same issue, but people think it's cool, until you have an accident because you have to look at the screen to perform the action.
Though we should keep in mind the issue wasn't the touchscreen itself; it was that it failed 20% of the time. No mention of what those failures were.
@@allannielsen4752 Agree. I don't like touch screens in cars. I still prefer mechanical buttons; they are very easy to operate. I just can't stand 30FPS animations.
Lol, how is muscle memory going to help with things like GPS, waypoints, or seeing the battlefield? Those are obvious candidates for a touchscreen. If you want to see your friendly troops and detected enemy troops and known AA emplacements, you're going to want a large touch screen.
@@johnflux1 but not when you want to set the throttle or toggle any switches.
@@johnflux1 Saying lol doesn't make your point. This fighter plane has failed, and this is one of the reasons.
One bad design from Microsoft? Hahah, if only it were one. I work for Microsoft, and everything is badly designed by default. Because, as the first slide shows, they go with guts and strong opinions. 0 architecture, 0 testing, 0 CI/CD.
Please, fix Gnome.
We should all use Silverlight and XAML 😂
It's a great talk, but it's almost exactly the same as another one available on youtube for months.
I hate web "apps" so much, javascript is a punishment for computer sins
Wouldn't it be cool if we brought back common sense in software development 😂
I always thought touchscreen was a flawed interface design choice. It may be a necessity, but that doesn't mean more time and brainpower should go into supporting it than into seeking a viable alternative. "Good enough" should not be the end goal.
Human Interface should be covered in software development courses, because so many times developers don't bother to learn how their applications are used, so they have no idea how a given feature will fit into the workflow. So they crank out a proof of concept implementation and proudly call it done, then are bewildered by the end-user complaints. The only course I took that even approached the topic was by an industry professional who seemed to believe that you couldn't predict how users function, so you just have to listen to them. Well, maybe learn enough about their tasks to get into their perspective, so you can first ask the right questions, and then get your hands into the interface yourself and test!
A touchscreen is an obvious choice. When you are in your car, do you want to program your GPS without a touchscreen?
@@johnflux1 Human language interface. I actually don't use a GPS in my car though, only on foot.
@@enkephalin07 You think pilots should have to program GPS etc with no screen and just by talking to it?
@@johnflux1 I suspect neither of us actually know what it's like to pilot.
@@enkephalin07 Indeed. But I have flown virtual planes in flight sims. Things like GPS, waypoints, seeing the battlefield etc are obvious candidates for a touchscreen. If you want to see your friendly troops and detected enemy troops and known AA emplacements, you're going to want a large touch screen. Can't you see that?
At this point I fucking hate touch screens, and the only things they're good for are gimmicky video games and McDonald's kiosks.
This speaker fundamentally misunderstands the F-35 program and it severely wounds the credibility of the presentation.
and they did it again with win 11
Ehhhh...
Coolness can be motivating, but can't be the sole _reason_ you do something, unless you're a child.
Wouldn't it be cool… Wifi
Wouldn't it be cool… iPhone
Wouldn't it be cool… Netflix
Wouldn't it be cool… Clippy?
Oh! I get why he'd say it's a bad design approach, now.