How "trusted computing" will destroy both itself and the whole tech industry

I have been developing a permacomputing operating system as a hobby and a learning project. It also has a TCP/IP stack and can go online, so of course it can also be used as a server. To test the stability of the software stack I have kept an old 486 machine running as a server. Almost all code in the system, excluding only most BIOS routines and the driver of the network card, is written by me.

Two days ago I discovered that a certain Finnish ISP has started denying its customers' connection attempts to my 486 server. Their nameserver does not resolve its DNS name, and connection attempts to the IP address also fail: some router between the client and my server drops the packets. So apparently they have started blocking connections to servers which don't run some known server software and/or whose TCP/IP stack does not behave in a certain way.

Some antivirus programs, for example Malwarebytes, have started flagging my server as a "malware spreader". If you try to visit my server's web page with a mainstream browser like Chrome, it will probably just show you a warning page saying that the site is "not safe" and refuse to let you go any further. So even an innocent website like this one becomes subject to this kind of mischief, only because it is not hosted by the IT megacorporations and does not use their approved software stacks.

The recent buzz about replacing the IBM PC compatible BIOS with UEFI has also caused all sorts of adverse effects on software freedom. In addition to the fact that UEFI does not offer runtime services, so that the operating system has to have built-in drivers for every device in the computer - which is impossible for small indie operating system developers like myself - many UEFI implementations are also completely locked down so that they can run only the operating system that was pre-installed on the computer at the factory.
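The two failure modes described above - the nameserver refusing to resolve the name, and packets to the bare IP address being dropped - can be told apart with a few lines of C. This is a minimal sketch, not part of my system; the hostname is a placeholder, and a connect() without a timeout is used for brevity:

```c
#include <netdb.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Returns 1 if the resolver can turn `host` into at least one address. */
int dns_resolves(const char *host)
{
    struct addrinfo hints, *res = NULL;
    memset(&hints, 0, sizeof hints);
    hints.ai_socktype = SOCK_STREAM;
    int rc = getaddrinfo(host, NULL, &hints, &res);
    if (rc == 0)
        freeaddrinfo(res);
    return rc == 0;
}

/* Returns 1 if a plain TCP connection to host:port succeeds. */
int tcp_connects(const char *host, const char *port)
{
    struct addrinfo hints, *res, *p;
    memset(&hints, 0, sizeof hints);
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo(host, port, &hints, &res) != 0)
        return 0;
    int ok = 0;
    for (p = res; p && !ok; p = p->ai_next) {
        int fd = socket(p->ai_family, p->ai_socktype, p->ai_protocol);
        if (fd < 0)
            continue;
        if (connect(fd, p->ai_addr, p->ai_addrlen) == 0)
            ok = 1;
        close(fd);
    }
    freeaddrinfo(res);
    return ok;
}

int main(void)
{
    const char *host = "my-486.example.net";  /* placeholder hostname */
    printf("DNS: %s\n", dns_resolves(host) ? "resolves" : "blocked or missing");
    printf("TCP: %s\n", tcp_connects(host, "80") ? "connects" : "dropped or refused");
    return 0;
}
```

If DNS fails but a direct connection to the IP works, only the nameserver is censored; if both fail, a router in between is dropping the packets, as in my case.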
It seems that the committee that designed the UEFI API does not actually understand what an operating system is, and that not every bootable program is one. In many cases the user simply wants to run a diagnostics program that uses the I/O in such a way that an operating system would only get in the way - that is why utilities like memtest86 are often bootable programs that work without an operating system. For programs like these it would also be impractical to include a whole driver stack for every existing hardware combination in a small program whose only job is to test memory.

Also, not every operating system is a huge, monolithic, complex piece of software like Windows and most Linux distributions are. Many operating systems are small hobby projects or non-general-purpose systems that only exist to do some specific task which would be impractical or inefficient to do on a general-purpose OS like Linux or Windows. BIOS-like APIs exist to make it easier to create this kind of operating system.

These things are not a problem only for the people who use computers. They will also eventually destroy the whole tech industry and set humanity back 50 to 100 years in technological advancement. Many low-level things - generally everything that involves deep knowledge - can only really be learned and understood by doing them yourself, by implementing things from scratch. That means it must be possible to create your own operating system, your own TCP/IP stack and your own server software stack, based on the public specifications of how common open protocols like HTTP, TCP and IP work.

To become a good kernel developer you must be allowed to create your own operating system and run it on your own physical computer. A virtual machine will not do: the I/O emulation in virtual machines is usually very imprecise and does not behave at all like real hardware does.
Virtual machines are meant for running already existing operating systems, not for developing new ones. New computers are already so locked down that most operating system developers only ever test their OS in a virtual machine, and it shows. Many "tested and working" low-level code examples on OSDev don't actually work at all on a real physical computer. Often they are missing things like the small delays that real hardware needs but the emulated I/O of a virtual machine does not. Sometimes they just make weird assumptions that hold only on the largely uniform platform of virtual computers.

Why is this a problem for the big tech companies? Because platforms like the already existing operating systems (Windows) also have to be maintained by someone. Once learning these things has been made impossible, in a few years there will not be anyone left who can maintain that stuff after the old greybeards have retired. The corporate investors who are currently in charge of making these decisions care only about the profit of the current quarter - they don't think about the future or make the sustainable decisions that would ensure the continuation of technological progress.