Wow, someone gave enough of a fuck about the fucks I give to write an app to monitor it.
Pity I don't give enough fucks to install it.
Because some users are putting that data on Linux, so they want Linux killed.
They can't change GRUB, but they sure as hell can convince micro$org to search for and nuke it.
Of course, no idea if this happened. Just answering why they might want to.
Short term, the issue is going to be how to drill and inspect the water while guaranteeing no contamination by Earth organics.
Until folks are 100% sure no life at all exists, any other consideration will be on hold.
And proving a negative is going to take ages before folks are happy.
Cool. At the time, it was one of the best. Although I also liked SunOS.
I also worked with VMS a lot after uni. Hated using it. But had to respect the ideals behind it.
But watching the growth of Linux has been fantastic. In 2024, it does seem to have out-evolved all the others. (Evolved, defined as having developed the ability to survive by becoming so freaking useful.)
I am starting to think it is time for a micro kernel version, though.
Was a few years later for me.
Not DMU by any chance?
Late 1990s, my uni had Unix workstations running HP-UX.
So all projects etc. were expected to be done on those. Linux at the time was the easy way to do it from home.
By the time I left uni in '98, I was so used to it that Windows was a pain in the butt.
For most of the time since, I have been almost 100% Linux, with just a dual boot to sort some hardware/firmware crap.
Ham radio to this day: many products can only do updates with Windows.
Yeah that is often the issue.
It is rare for a single company to own all the IP in its code. So it's common that companies are not releasing code because they cannot.
Yeah, any reverse engineering of closed-source code takes time. It's a huge job on its own, before adding the need to avoid actions that may lead to legal issues.
Well, yep. It's very likely this may never round out to a perfect replacement product.
But it still has value. For starters, it will encourage new open-source projects to use it rather than the proprietary version, long before it's a direct-replacement-capable product.
So the effort is worth some excitement. At least a pat on the back and free beer for some of the guys trying.
Those of us who grew up during the IRA bombing find that statement odd.
Just off the top of my head, discovered today.
Not a GUI, as one exists. But a more configurable one, as the existing one is crap for the visually impaired.
The rpi-imager GUI does not take theme indications for font size etc. Worse, it has no configuration to change such things.
That makes it pretty much unusable for anyone with poor vision.
Also, it varies for each visually impaired individual, but dark mode is essential for some of us.
So if you're looking for small projects, you'd at least make me happy ;)
Nice idea, I love it. But you have to remember, those investigations cost huge time and money. Consider the cost of full-time staff over the 10 years you include, plus the cost of building a case against some of the largest corporations, all before any court costs are considered.
We are likely better off having that money reinvested in preventing other companies from these practices.
Yep, pretty much, but on a larger scale.
1st, please do not believe the bull that there was no problem. Many folks like me were paid to fix it before it was an issue. So other than at a few companies, few saw the result, not because it did not exist, but because we were warned. People make jokes about the over-panic, but if that had not happened, it would have taken years to fix, not days, because without the panic most corporations would have ignored it. Honestly, the panic scared shareholders, so boards of directors had to get experts to confirm the systems were compliant. And so much dependent crap was found running that it was insane.
But the exaggerations of planes falling out of the sky etc. were also bull. Many systems would have failed, but a BSOD would be rare; code would crash, and some would exit with errors, shutting down cleanly, some undiscovered until a short while later, as accounting or other errors showed up.
As others have said, the issue was that since the 1960s, computers were set up to treat years as 2 digits, so they had no way to handle 2000 other than to assume it was 1900. While from the early 90s most systems were built with ways to adapt to it, not all were, as many were only developing top-layer stuff, and many libraries etc. had not been checked for this issue. Huge amounts of the world's IT infrastructure ran on legacy systems, especially in the financial sector, where I worked at the time.
The internet was a fairly new thing, so often stuff had been running for decades with no one needing to change it, or having any real knowledge of how it was coded. So folks like me were forced to hunt through code, or often replace systems that were badly documented, or more often not documented at all.
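The two-digit-year convention described above can be shown with a toy sketch (a hypothetical parser, not real legacy code):

```python
def parse_year_2digit(yy: str) -> int:
    # Legacy convention: store only the last two digits of the year
    # and implicitly assume the century is 1900 when reading it back.
    return 1900 + int(yy)

print(parse_year_2digit("99"))  # 1999, as intended
print(parse_year_2digit("00"))  # 1900: the year 2000 silently becomes 1900
```

Any date arithmetic built on top of this (ages, interest periods, expiry checks) suddenly produces negative or wildly wrong spans at the rollover, which is exactly the kind of downstream accounting error mentioned above.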
A lot of modern software development practices grew out of discovering what a fucking mess can grow if people accept an “if it ain’t broke, don’t touch it” mentality.
Very much so. But the vulnerabilities do not tend to be discovered (by developers) until an attack happens, and auto-updates are generally how the spread of attacks is limited.
Open source can help slightly, due to both good and bad actors unrelated to development seeing the code, so it is more common for alerts to land before attacks. But it's far from a fix-all.
Generally, the time between discovery and fix is a worry for big corps, which is why auto-updates have been accepted with less manual intervention than was common in the past.
Not OP, but that is how it used to be done. The issue is that the attacks we have seen over the years, i.e. ransomware attacks etc., have made corps feel they need to fix and update instantly to avoid attacks. So they depend on the corp they pay for the software to test the rollout.
Auto-update is a two-edged sword. Without it, attackers will take advantage of delays. With it, well, you get days like today.
If they just called it "Other",
it would gain a huge boost in desktop usage figures.
Thanks. That was exactly what I needed. I’ll look it up.
Agree. Although "completely" is a broad term; someone will always have an excuse not to.
But we definitely need to ensure we have a 100% clear picture that there is no ecology, plus a clear mapping and review of all the geology.
But honestly, before actual terraforming, we need the tech to build an artificial magnetic field. Anything else can honestly only mean generating atmosphere within enclosed environments, as plant and human life will not be successful in open air, and any atmosphere will need constant replacement.
This moss can apparently survive for a short time, so it would be a huge help, as minor breaches will leave the moss able to produce oxygen. Definitely bloody useful in any form of colonisation.
And let's face it, any true study of the planet will need people. Even if most of it is robotic, the distance from Earth is such that having experts on the planet will make a huge difference. It's 6 to 44 minutes for round-trip communication; while doable, as we see, cutting it to near zero will really be essential for a full investigation, even with way better AI than we have now.
Agreed. Most of us really do not think about this shit as often as we should. I know I am guilty of assuming "he" when typing. I know because I make an effort not to, and notice how often I need to correct text. Being older than many developers, I just grew up with the assumptions, so like many my age I needed my attention drawn to the societal indoctrination.
People politely pointing it out is important. As is people volunteering to help correct older documentation.
The direct numerics of Moore's law may not be definite.
But the principle it describes is: in the future, computers will have much more power than they do now.
The reason modern GPUs use things like shaders etc. is to let them achieve massive manipulation of data in more efficient ways, specific to the task desired.
Honestly, this is why I mention time scale as the main thing that will make this possible. How modern GPUs or other specialised processors do the task is less important than what the game code is asking the GPU to achieve.
The idea that at some unknown future date the CPUs, GPUs, or whatever future tech we have will never be able to run fast enough to read current CPU or GPU instruction sets, and generate the defined effect using future techniques, is not viable as an argument. The only questions are how long it takes, and whether anyone will have the motivation to reverse engineer the large but finite instruction sets used by secretive hardware corps today.
I will add, as a narrowboater,
I found towpaths also have this issue with the definition of the surface.
I am legally blind. (Some vision but bad)
I have a few times tried to add more detail to areas of towpath that will help others like me know what to expect before mooring.
It seems anything that improves this will help with your issues as well.
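Assuming the mapping being discussed here is OpenStreetMap (a guess from context), this kind of towpath detail is usually captured with standard keys such as `surface` and `smoothness`, e.g.:

```
# Sketch of tags on a towpath way (values are illustrative, not from a real way)
highway=path
surface=compacted
smoothness=intermediate
wheelchair=limited
lit=no
```

Even just `surface` and `smoothness` on each stretch would tell a visually impaired boater roughly what to expect underfoot before choosing a mooring.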