tl;dr:“I have delivered a lot of successful engineering projects. When I start on a project, I’m now very (perhaps unreasonably) confident that I will ship it successfully. Even so, in every single one of these projects there is a period - perhaps a day, or even a week - where it feels like everything has gone wrong and the project will be a disaster. I call this the valley of engineering despair. A huge part of becoming good at running projects is anticipating and enduring this period.” Sean discusses how he tackles this phase.
tl;dr:“The principle here is something like the psychological trick door-to-door evangelists use on new converts - encouraging them to knock on doors knowing that many people will be rude, driving the converts back into the comforting arms of the church. It’s even possible to imagine AI models deliberately doing this exact thing: setting users up for failure in the real world in order to optimize time spent chatting to the model.”
tl;dr:“Personally, I feel like I get a lot of value from AI. I think many of the people who don’t feel this way are “holding it wrong”: i.e. they’re not using language models in the most helpful ways. In this post, I’m going to list a bunch of ways I regularly use AI in my day-to-day as a staff engineer.”
tl;dr:“I haven’t been particularly impressed by most online content about LLMs and security. For instance, the draft OWASP content is accurate but not particularly useful. It portrays LLM security as being a wide array of different threats that you have to familiarize yourself with. Instead, I think LLM security is better thought of as flowing from a single principle. Here it is: LLMs sometimes act maliciously, so you must treat LLM output like user input.”
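The principle quoted above — treat LLM output like user input — can be illustrated with a minimal sketch. This is not code from the post; the function names and the `ALLOWED_ACTIONS` allowlist are illustrative assumptions. The idea is simply that model output gets the same escaping and validation you would apply to an untrusted form field:

```python
import html
import json

# Illustrative allowlist of actions the application permits;
# the specific names here are hypothetical.
ALLOWED_ACTIONS = {"search", "summarize"}

def render_llm_reply(llm_output: str) -> str:
    """Escape LLM output before embedding it in HTML,
    exactly as you would escape user-submitted text."""
    return f"<p>{html.escape(llm_output)}</p>"

def dispatch_llm_action(llm_output: str) -> str:
    """Validate structured LLM output against an allowlist
    before acting on it; never eval/exec it directly."""
    try:
        action = json.loads(llm_output)
    except json.JSONDecodeError:
        return "rejected: not valid JSON"
    if action.get("name") not in ALLOWED_ACTIONS:
        return "rejected: unknown action"
    return f"ok: {action['name']}"

# A prompt-injected reply is neutralized by escaping:
print(render_llm_reply("<script>alert(1)</script>"))
# A malicious tool call is rejected by validation:
print(dispatch_llm_action('{"name": "drop_tables"}'))
```

In both cases the defense is the same one you would use against a hostile user, which is the post's point: there is one principle, not a long catalogue of separate threats.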
tl;dr:“I’m a big fan of “sharp tools”. These are tools that are powerful enough to be hugely helpful or harmful, depending on how they’re used. Most forms of direct production access are in this category: ssh or kubectl access, or a read-write prod SQL console. It’s also possible to give “dangerous advice”. Dangerous advice is dangerous because (like sharp tools) it takes competence and judgment to use well. Giving the wrong person dangerous advice is like giving the wrong person production SQL access - they might go off and do something enormously destructive with it.”
tl;dr:“Empathetic managers care. They are emotionally invested in their employees as human beings, and actively campaign to support their employees’ needs. Ruthless managers are there to do their job. They aren’t necessarily assholes, but they see their main role as communicating the company’s needs to their engineers and vice versa. They will almost never go out on a limb on an employee’s behalf.”
tl;dr:“In the glory days of the 2010s, tech companies were very invested in their employees’ work-life balance. Those glory days are over. Anecdotally, tech company executives are now internally directing their employees to work harder and faster, with the new threat of layoffs adding weight to that directive. Engineers are rightfully scared. What should we do?”