Refactoring should be done constantly in a codebase, for every story. As soon as people get scared of changing things, the codebase is on the road to becoming legacy.
I’ve worked with a lot of codebases that had no unit tests at all, where everyone was afraid to change anything because the QA process could take weeks to months.
The result is you have a codebase that ages like milk.
Doesn’t everybody agree with this? I really never thought of it as a hot take.
Today I removed code from a codebase that was added in 2021 and never ever used. Sadly, some people are as content to litter in their repo as they are in the woods.
Only if the code base is well tested.
Edit: always add tests when you change code that doesn’t have tests.
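A sketch of what that can look like in practice, assuming pytest and a made-up legacy function: before refactoring untested code, pin down its current observed behaviour with a characterization test, then change it with confidence.

# Hypothetical stand-in for untested legacy code.
def legacy_price_with_tax(amount):
    return round(amount * 1.19, 2)

# Characterization test: record what the code does *today*, before refactoring.
def test_legacy_price_with_tax_current_behaviour():
    assert legacy_price_with_tax(100) == 119.0
    assert legacy_price_with_tax(0) == 0
    assert legacy_price_with_tax(0.01) == 0.01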
Dynamic typing is insane. You have to keep track of the type of absolutely everything, in your head. It’s like the assembly of type systems, except it makes your program slower instead of faster.
Nothing like trying to make sense of code you come across where all the function parameters have unhelpful names, aren’t primitive types, and have no type information whatsoever. Then you get to crawl through the entire thing to make sense of it.
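To illustrate with a made-up example, compare stumbling into these two signatures cold:

from dataclasses import dataclass

# You get to crawl through every call site to learn what these are.
def schedule(job, opts):
    ...

# The annotated version answers the question at the signature.
@dataclass
class RetryOptions:
    max_attempts: int
    backoff_seconds: float

def schedule_typed(job_name: str, opts: RetryOptions) -> None:
    ...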
If you don’t add comments, even rudimentary ones, or you don’t use a naming convention that accurately describes the variables or the functions, you’re a bad programmer. It doesn’t matter if you know what it does now; just wait until you need to know what it does in 6 months and you have to stop what you’re doing and decipher it.
However, engineers who rely solely on comments to explain their code are bad at writing readable code.
Self-documenting code is infinitely more valuable than comments, because the code spreads with its use, whereas the comments stay behind.
I got roasted at my company when I first joined because my naming conventions are a little extra. That lasted for about 2 months before people started to see the difference in legibility as the code started to change.
One of the things I tell my juniors is, “this isn’t the 80s. There isn’t an 80-character line limit. The computer doesn’t benefit from your short variable names. I should be able to read most lines of code as a single, non-compound English sentence with only minor tweaks, and that English sentence should be what is actually happening in the line.”
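For example (names are hypothetical), the kind of line I mean:

MAX_FAILED_LOGIN_ATTEMPTS = 5

class User:
    def __init__(self, failed_login_attempts: int) -> None:
        self.failed_login_attempts = failed_login_attempts

    def has_exceeded_failed_login_attempts(self) -> bool:
        return self.failed_login_attempts > MAX_FAILED_LOGIN_ATTEMPTS

def lock_account_and_notify(user: User) -> None:
    print("account locked, notification sent")

user = User(failed_login_attempts=6)

# Reads as the sentence: "if the user has exceeded failed login attempts,
# lock the account and notify."
if user.has_exceeded_failed_login_attempts():
    lock_account_and_notify(user)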
Programming is a lot less important than people and team dynamics.
People can always be replaced, they’re irrelevant.
Sure, try to replace the one or two people who hold the whole team together. I’ve seen it a couple of times: a good team disintegrates right after one or two key people leave.
Also, if you replace half the team, prepare for some major learning time whenever the next change is being made. Or after the next deployment. 🤷‍♂️
The code can always be rewritten, it is irrelevant.
And it will change next week anyway. 😁
Python is only good for short programs
Python is only good for short programs
Was Python designed with enterprise applications in mind?
It sounds like some developers have a Python hammer and they can only envision using that hammer to drive any kind of nail, no matter how poorly.
I mean, it’s still a very nice language. I can see how someone, marveled by that, would endeavor to make bigger things with it. I just don’t feel it scales that well.
I agree. The GIL and packaging woes are a good indication that its range of applications isn’t as extensive as that of other tech stacks.
packaging woes
My own hot take is that I hear this criticism of Python a lot, but have never had anyone actually back it up when I ask for more details. And I will be very surprised to hear that it’s a worse situation than Java/TypeScript’s.
We used to have a Python guy at my work. For a lot of LITTLE ETL stuff he created Python projects. In two projects I’ve had to fix up now, he used different tooling, and both of those tools (Poetry, Conda) have failed me. I ended up using our CI/CD pipeline code to run my local stuff, because I could not get those tools to work.
For comparison, it took me roughly zero seconds to start working on an old Go project.
Python was built in an era where space was expensive and it was only used for small, universal scripts. In that context, having all packages be “system-wide” made sense. All the virtual env shenanigans won’t ever fix that.
In that context, having all packages be “system-wide” made sense. All the virtual env shenanigans won’t ever fix that.
Sorry, but you’ll need to explain this a little bit more to me. That’s precisely what virtual env shenanigans do: make it so that your environment isn’t referencing the system-wide packages. I can totally see that it’s a problem if your virtual env tooling fails to work as expected and you can’t activate your environment (FWIW, plain old
python -m venv venv; source venv/bin/activate
has never let me down in ~10 years of professional programming, but I do believe you when you say that Poetry and Conda have broken on you); but assuming that the tools work, the problem you’ve described completely goes away.
Check out Home Assistant.
I don’t mean it doesn’t work for larger projects, just that it’s a pain to understand others’ code when you have almost no type information, which makes it, to me, a no-go for that.
Larger projects in Python (like Home Assistant) tend to use type hints and enforce them through linters. Essentially, these linters (with a well-set-up IDE) turn programming in large Python projects into an experience very similar to programming in a statically typed language, except that Python does not need to be compiled (and type-checked) to run. So you can still run it before you have satisfied the linters; you just can’t commit or push or whatever (depending on project setup).
And yes, these linters and the Python type system are obviously not as good as something like a Go or Rust compiler. But then again, Python is a generalist language: it can do everything, but excels at nothing.
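A small sketch of that workflow, assuming mypy (or pyright) as the checker; the function is invented:

def total_cost(prices: list[float], tax_rate: float) -> float:
    # Sum the prices and apply a flat tax rate.
    return sum(prices) * (1 + tax_rate)

print(total_cost([9.99, 4.50], 0.20))  # ~17.39

# The file runs whether or not the checker is happy, but mypy catches a bad
# call before it ever blows up at runtime, with something like:
#   total_cost(["9.99"], 0.20)
#   error: List item 0 has incompatible type "str"; expected "float"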
Go and Rust are also generalist languages. Basically all mainstream programming languages are, and they’re equally powerful (in terms of what they can do, rather than performance), since they’re all Turing complete. You can emulate C in Python or Python in C, for instance.
Anything you can do in Python you can do in basically any other mainstream language. Python is better at some niches than others, just like all other languages are with their own niches, and all of them can be used generally for anything. Python has a lot of libraries that can make it easier to do a large range of things than a lot of other languages, but really, so do quite a few of the popular languages these days.
SPAs are mostly garbage, and the internet has been irreparably damaged by lazy devs chasing trends just to build simple sites with overly complicated frontend frameworks.
90% of the internet should actually just be rendered server-side with a bit of JS for interactivity. jQuery was fine at the time, JavaScript itself is better now, and Alpine.js is actually awesome. Nowadays, REST with HTMX and HATEOAS is as productive, painless and enjoyable as web development gets. Minimal dependencies, tiny file sizes, fast and simple.
Unless your web site needs to work offline (it probably doesn’t), or it has to manage client state for dozens or hundreds of data points (e.g. Google Maps), you don’t need a SPA. If your site only needs to track minimal state, just use a good SSR web framework (Rails, ASP.NET, Django, whatever).
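To make the HTMX point concrete, a minimal sketch assuming Flask (routes and markup invented): the server renders all the HTML, and interactivity is just fragments swapped into the page.

from flask import Flask, request

app = Flask(__name__)

ITEMS = ["alpine", "htmx", "hotwire"]

@app.get("/")
def index():
    # Full page rendered server-side; htmx swaps in fragments on keyup.
    return (
        '<script src="https://unpkg.com/htmx.org"></script>'
        '<input name="q" hx-get="/search" '
        'hx-trigger="keyup changed delay:300ms" hx-target="#results">'
        '<ul id="results"></ul>'
    )

@app.get("/search")
def search():
    q = request.args.get("q", "").lower()
    # Return an HTML fragment, not JSON; the client swaps it into the DOM.
    return "".join(f"<li>{item}</li>" for item in ITEMS if q in item)

if __name__ == "__main__":
    app.run()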
I loathe JavaScript heavy websites, especially when used for forms. Don’t break autofill and copy/paste. And don’t complain about the format just because I pasted! Seriously, why is pasting text in the correct format triggering some JavaScript framework? It all seriously gets to me.
With that said, I really like Hotwire. HTML over the wire is bliss. Stimulus is perfect for sprinkling non-invasive JavaScript throughout an application.
Not sure if these are hot takes:
- Difficult to test == poorly designed
- Code review is overrated and often poorly executed, most things should be checked automatically (review should still be done though)
- Which programming language you use doesn’t matter (within reason), while the number of programming languages matters a lot
Python, and dynamically typed languages in general, are known for being great for beginners. However, I feel that while they’re fun for beginners, they should only be used if you really know what you’re doing, as the code can get messy real fast without some guardrails in place (static typing being a big one).
Disagree on this one, even though I can see where you are coming from. I first learnt programming in Java, and I had massive problems understanding the structure and the typing. Obviously Java isn’t the most beautiful language anyway, but once I picked up Python, solving problems started to click for me, because I didn’t have to think about that many things. I could just go for it. Yes, my code was messy in the beginning, but I wasn’t working on any important projects. It was just for fun.
So I think learning how to solve problems is as important as writing clean code. And python really helped me with that.
I’d actually argue Python stops people learning how to solve problems.
I love teaching juniors and have done so for 10 years, but I’ve noticed that in the last 4-5 years, since Python became the popular choice at universities, graduates aren’t learning anything about static types, memory management, object-oriented programming, data encapsulation, composition, service-oriented architecture, etc.
I used to expect most graduates to have a mixed grounding in those concepts, and I would find excuses for them to work on small UI projects. I did this because it gets them used to solving a small problem, and UIs give instant feedback. As Python became the dominant university teaching language, graduates stopped spending their time learning TypeScript, Angular, HTML, etc., and instead get overwhelmed by the concept of types.
Those concepts I want them to learn were created to help make solving problems easier, and each has its strengths and weaknesses, but most graduates are coming through knowing only how to lay out a small amount of procedural logic in Python, and they really struggle to move beyond that.
My take is that no matter which language you are using, and no matter the field you work in, you will always have something to learn.
After 4 years of professional development, I rated my knowledge of C++ at 7/10. After 8 years, I rated it 4/10. After 15 years, I can confidently say 6.5/10.
Amen. I once had an interview where they asked what my skill with .NET was on a scale of 1-10. I answered 6.5, even though at the time I had been doing it for 7 years. They looked annoyed and said they were looking for someone who was a 10. I countered that nobody is a 10, not them or even the people working on the framework itself. I didn’t pass the interview, and I think this question was why.
As a hiring manager, I can understand why you didn’t get the job. I agree that it’s not a “good” question, sure, but when you’re hiring for a job where the demand is high because a lot is on the line, the last thing you’re going to do is hire someone who says their skills are “6.5/10” after almost a decade of experience. They wanted to hear how confident you were in your ability to solve problems with .NET. They didn’t want to hear “aCtUaLlY, nO oNe Is PeRfEcT.” They likely hired the person who said “gee, I feel like my skills are 10/10 after all these years of experience of problem solving. So far there hasn’t been a problem I couldn’t solve with .NET!” That gives the hiring manager way more confidence than something along the lines of “6.5/10 after almost a decade, but hire me because no one is perfect.” (I am oversimplifying what you said, because this is potentially how they remembered you.)
Unfortunately, interviews for developer jobs can be a bit of a crap shoot.
They wanted to hear how confident you were in your ability to solve problems with .NET. They didn’t want to hear “aCtUaLlY, nO oNe Is PeRfEcT.”
Yeah, I mean, no shit, with hindsight it’s obvious they were looking for the 10/10 answer. I was kicking myself for days afterwards because that’s the only question I felt I answered “wrong”. Tech interviews are such a shit show, though, that you can start to overthink things as an interviewee. Also, an important aspect of the question that I didn’t mention was that they specified “1 is completely new, and 10 is working at Microsoft on the .net framework itself”. The question caught me off guard. I have literally no idea what working at Microsoft on the framework is like. In that context, being a 10/10 felt like being among the most knowledgeable people about C# of all time. Could I work on the framework itself? Idk, maybe; I’ve never thought about it, and I don’t even know what their day-to-day is. I should’ve just said 10/10 though. It was a dev II position to work on a web app; it wouldn’t have been that hard.
10 is working at Microsoft on the .net framework itself.
An interesting spin. I like to imagine that you could have answered “10/10,” taken a pause, and declared that you’re leaving the interview early to apply directly to Microsoft to “work on the .net framework itself.” 🤓
dev II position to work on a web app
“we want you to tell us that you’re overqualified for the role”
Tabs are better than spaces
Tabs are literally designed for aligned indentation, and they’re configurable for clientside viewing. There is no excuse for spaces. I don’t care if your goddang function arguments line up once they spill out onto another line. You’ve got deeper problems.
Tabs are designed for tabulation (hence the name), not indentation. The side effect is that a tab’s length changes based on its position in a line, which is terrible for programming. If you use tabs in the Python REPL, it looks like this:
>>> def frobnicate_all(arr):
>>>     for item in arr:
>>>         frobnicate(item)
a tab’s length changes based on its position in a line
What does this even mean? A tab is a tab.
Tabs don’t have multiple lengths inside a file; they all have the same length.
That’s the point of tabs.
Agile in its current implementation, with excessive meetings, wastes more time than the mistakes it tries to avoid.
Go with what works
- Error messages should contain the information that caused the error. Your average Microsoft error “error 37253” is worthless to me (see the sketch after this list)
- Keep functions or methods short. Anything longer than 20-50 lines is likely too long
- Comment why it’s happening, not what
- PHP is actually a really nice language to work with, both for web and command-line utils
- Don’t over-engineer. KISS: Keep It Simple, Stupid
- SOLID is quite okay, but sometimes there are solid reasons to break those rules
- MVC is a PITA in practice; avoid it when possible
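The error-message sketch promised above (function invented): put the offending value and the constraint right in the message.

def set_request_timeout(seconds: float) -> None:
    # "timeout must be positive, got -3" beats "error 37253" every time.
    if seconds <= 0:
        raise ValueError(f"timeout must be positive, got {seconds!r}")
    print(f"timeout set to {seconds}s")

set_request_timeout(2.5)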
Your average Microsoft error “error 37253” is worthless to me
This is a security thing. A descriptive error message is useful for troubleshooting, but an error message that is useful enough can also give away information about architecture (especially if the application uses remote resources). Instead, provide an error code and have the user contact support to look up what the error means, and support can walk the user through troubleshooting without revealing architecture info.
Another reason can be i18n/l10n: Instead of keeping translations for thousands of error messages, you just need to translate “An Error Has Occurred: {errnum}”
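A sketch of that pattern (codes and strings invented): one translatable template, with the real cause kept behind a code that support can look up.

TEMPLATES = {
    "en": "An Error Has Occurred: {errnum}",
    "de": "Ein Fehler ist aufgetreten: {errnum}",
}

# Internal, untranslated, never shown to the user.
INTERNAL_CAUSES = {37253: "license server unreachable"}

def user_facing_error(errnum: int, locale: str = "en") -> str:
    return TEMPLATES.get(locale, TEMPLATES["en"]).format(errnum=errnum)

print(user_facing_error(37253, "de"))  # Ein Fehler ist aufgetreten: 37253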
Those benefits both make sense, but are those really the original motivation for Microsoft designing the Blue Screen of Death this way? They sound more like retroactive justifications, especially since BSODs were around well before security and internationalization were common concerns.
Linux has something similar: kernel panics. However, with Linux you get useful information dumped to your screen.
Nothing gets logged on Linux either, just like Windows, and that makes sense. The kernel itself messed up and can’t trust its own memory anymore. Writing to disk may cause you to fuck up your disk, so you simply stop everything.
Still though, Linux dumps useful info, whereas Windows just gives you this dumb useless code.
As an embedded firmware guy for 10ish years:
C can die in a fire. Its “simplicity” hides the emergent complexity of actually using it: there are nearly no compile-time checks for anything and nearly no potential for sensible abstraction. It’s like walking an infinite tightrope in fog while an earthquake is happening.
For completely different reasons, the same is true for C++, but to a far lesser extent.
Compiler-checked typing is strictly superior to dynamic typing. Any criticism of it is either ignorance, applicable only to older languages, or about a feature temporarily missing from current languages.
Using dynamic languages is understandable for a lot of language-external reasons; it’s just that I really feel there’s no good argument for it.
It’s much easier to work with streams of untyped data in a weakly typed language.
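For what it’s worth, this is the kind of thing they mean (payload invented): with dynamic typing you can poke at arbitrary JSON without modelling it first.

import json

payload = json.loads('{"user": {"name": "sam", "tags": ["dev", "ops"]}, "extra": 42}')

# No schema, no DTO classes, no deserializer config: index in and go.
print(payload["user"]["name"])
for tag in payload["user"]["tags"]:
    print(tag.upper())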
Tools that use a GUI are just as good (if not better) than their CLI equivalents in most cases. There’s a certain kind of dev that just gets a superiority complex about using CLI stuff.
The big thing you can do from the command line is script it.