Block scoping in Python

Python uses function-level scoping in most cases:

def f():
    x = 6
    if x > 0:
        y = 5
    print(y)  # This works, even though `y` was assigned inside the `if`

A variable assigned anywhere outside a function lives in the module's global scope, and a variable assigned inside a function lives in that function's scope. There are two exceptions: the variable bound by an except block, which is deleted when the block ends (keeping the exception alive would create a reference cycle through its traceback), and comprehension expressions, which behave like functions and get a scope of their own. Blocks like conditionals and loops don't have their own scope, unlike in C-family languages such as C, C++, Java, etc.
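All three behaviors are easy to check for yourself; here's a minimal sketch (standard Python 3 semantics, the function name g is arbitrary):

def g():
    for i in range(3):
        pass
    print(i)              # prints 2: loop variables leak out of the block

    squares = [n * n for n in range(3)]
    print(squares)        # [0, 1, 4]
    # print(n)            # NameError: `n` is local to the comprehension

    try:
        raise ValueError("boom")
    except ValueError as e:
        print(e)          # prints "boom"
    # print(e)            # NameError: `e` is unbound once the block ends,
    #                     # breaking the frame/traceback reference cycle

g()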

JavaScript, famously, also historically used function scoping with var, though block scoping was introduced in ES6 with the let and const keywords. Python doesn't have declarations, though, so there's no real way to properly retrofit block scoping onto the language, and we're still stuck with function scoping. Whether it's a good feature or not is outside the scope (got it?) of this blog post.

Python's scoping is quite consistent with the absence of declarations. How else would you do something like this:

if cond:
    x = 5
else:
    x = 6

If variables had to be declared, you'd have to resort to tricks like writing x = None beforehand, but this messes

Stack Machines and Where To Find Them

Ever tried googling "recursion"?

[screenshot of a Google search for "recursion"; the results begin with "Did you mean: recursion"]

There's something quite peculiar about recursion. Every developer and their dog has heard of it at some point, and most developers seem to have quite a strong opinion about it.

Sometimes, they were taught about it in college. Some old professor with a gray beard and funny words (the hell's a cons cell? why are you asking if I want to have s-expr with you?) made them write Lisp or Caml for a semester, growling at the slightest sign of loops or mutability at the poor student whose only programming experience so far was Java-like OOP. Months spent writing factorials, linked lists, Fibonacci sequences, depth-first searches, and other algorithms with no real-world use whatsoever.

Other times, it was by misfortune. While writing code in one of their usual C-family enterprise-grade languages, they accidentally made a function call itself, and were greeted by a cryptic error message about something flowing over a stack. They looked it up on Google (or Yahoo? AltaVista? comp.lang.java?) and quickly learned that they had just stumbled upon some sort of arcane magic that, in addition to being a plainly inefficient way of doing things, was way too complicated for any

Dell laptops, ruining audio drivers in 2023

This is a "rant + fix" blog post. If you're looking for an interesting post, check out the other ones.

I own a Dell Latitude 3420. It works well, has good battery life, a good keyboard, and lots of connectors (a rarity in laptops today, ugh). I got the 1366x768 version though, so I bought a replacement 1080p display because 768p is... small.

Anyway.

From the beginning, there was a process constantly hogging the CPU, sitting at 25-30% usage all the time. AC or battery, High Performance or High Efficiency mode, it was there. "WavesSysSvc":

[screenshot: "WavesSysSvc Service Application"]

I looked around and found that it was part of the audio driver. Why the hell would my audio driver take up a third of my computing power?!

Additionally, the headphone jack would just... refuse to work. The only way to get it to work was to have my headphones/speakers plugged in when the computer started, but that's not viable, so for all intents and purposes it was broken.

I searched around, browsed the Dell forums, and saw that a lot of people were having the same problem, with no answer from Dell apart from "try updating your drivers using SupportAssist" (my drivers were up to date).

Then, I stumbled onto

GPT: Straight Outta Copilot

💡
I'm not a lawyer.

You may have heard about this thing called GitHub Copilot. It's a tool that can be integrated inside an IDE and allows you to rip off licensed code hosted on GitHub. It has no intelligence of its own whatsoever, and any code it spits out must have been written as-is by a human developer at some point.

Oh, wait, sorry. That was Copilot writing a blog post from the perspective of someone who doesn't like it:

[screenshot of VS Code with a prompt asking Copilot to write a blog post intro, and Copilot outputting the exact paragraph you've just read]

There have been lots of good and bad takes on Copilot these past months, since the release of its technical preview in June 2021 and its general availability, on a subscription basis, in June 2022.

The recurring themes are, mostly:

  • Copilot is a copyright violation machine, since its dataset comes from code written by humans (i.e. intellectual property), and code produced by Copilot should constitute a derivative work.
  • Copilot is bad for education, because it offers no guarantee that the code it produces is correct. In the hands of beginners, it can give an illusion of competence.
  • Copilot is bad for security, because it's just making the same security mistakes that were present in

Crabs All the Way Down: Running Rust on Logic Gates

💡
This article will discuss many topics, from CPU architecture design to historical shenanigans. Take a drink, it's downhill from there.

Even though the number has steadily decreased since the 90s, there are still many different and incompatible CPU architectures in use nowadays. Most computers use x86_64, and pretty much all mobile devices and recent Macs use some kind of ARM64-based ISA (instruction set architecture).

In specific fields, though, there are more exotic ones: most routers still use MIPS (for historical reasons), a roomful of developers use RISC-V, the PS3 used PowerPC, some servers 20 years ago used Itanium, and of course IBM still sells their S/390-based mainframes (now rebranded as z/Architecture). The embedded world has even more: AVR (used in Arduino), SuperH (Saturn, Dreamcast, Casio 9860 calculators), and the venerable 8051, an Intel chip from 1980 which is still being produced, sold and even extended by third parties.

All these architectures differ in their defining characteristics, the main ones being:

  • word size: 8, 16, 31, 32, 64 bits, sometimes more
  • design style: RISC (few instructions, simple operations), CISC (many instructions performing complex operations), VLIW (long instructions, doing many things at once in parallel)
  • memory architecture: Harvard (separate