Discussion:
Intel's co-CEO claims retailers say Qualcomm-powered PCs have high return rates, points to new competitors with Arm chips coming in 2025
Joel
2024-12-14 00:35:05 UTC
https://www.tomshardware.com/pc-components/cpus/intels-interim-co-ceo-claims-retailers-are-concerned-by-return-rate-of-qualcomm-powered-machines


Bottom line, people are having an issue with familiarity and returning
their Windows ARM devices because they don't operate quite the same way.
Duh, that's a good thing: it's superior tech for a laptop. Don't just
return it.
--
Joel W. Crump

Amendment XIV
Section 1.

[...] No state shall make or enforce any law which shall
abridge the privileges or immunities of citizens of the
United States; nor shall any state deprive any person of
life, liberty, or property, without due process of law;
nor deny to any person within its jurisdiction the equal
protection of the laws.

Dobbs rewrites this, it is invalid precedent. States are
liable for denying needed abortions, e.g. TX.
rbowman
2024-12-14 00:47:26 UTC
Post by Joel
https://www.tomshardware.com/pc-components/cpus/intels-interim-co-ceo-claims-retailers-are-concerned-by-return-rate-of-qualcomm-powered-machines
Post by Joel
Bottom line, people are having an issue with familiarity and returning
their Windows ARM devices because they don't operate quite the same way.
Duh, that's a good thing: it's superior tech for a laptop. Don't just
return it.
"Compatibility issues are, of course, thought to be the primary reason why
people return Snapdragon X-based systems to retailers."

Windows RT redux. Hopefully the same vicious cycle won't repeat.

Consumers:
"This thing won't run my favorite Windows app. I'm taking it back!"

Developers:
"Those things aren't selling. We're not going to waste time writing native
Arm apps."

Industry Rags:
"Low sales suggest another Microsoft flop. Share price declining."
Joel
2024-12-14 01:05:47 UTC
Post by rbowman
Post by Joel
https://www.tomshardware.com/pc-components/cpus/intels-interim-co-ceo-claims-retailers-are-concerned-by-return-rate-of-qualcomm-powered-machines
Post by Joel
Bottom line, people are having an issue with familiarity and returning
their Windows ARM devices because they don't operate quite the same way.
Duh, that's a good thing: it's superior tech for a laptop. Don't just
return it.
"Compatibility issues are, of course, thought to be the primary reason why
people return Snapdragon X-based systems to retailers."
Windows RT redux. Hopefully the same vicious cycle won't repeat.
"This thing won't run my favorite Windows app. I'm taking it back!"
"Those things aren't selling. We're not going to waste time writing native
Arm apps."
"Low sales suggest another Microsoft flop. Share price declining."
Lol, yeah, this is just growing pains. Once people actually realize
the benefit of ARM for a laptop, this trend will seem meaningless and
forgotten.
--
Joel W. Crump

Amendment XIV
Section 1.

[...] No state shall make or enforce any law which shall
abridge the privileges or immunities of citizens of the
United States; nor shall any state deprive any person of
life, liberty, or property, without due process of law;
nor deny to any person within its jurisdiction the equal
protection of the laws.

Dobbs rewrites this, it is invalid precedent. States are
liable for denying needed abortions, e.g. TX.
rbowman
2024-12-14 06:06:44 UTC
Post by Joel
Lol, yeah, this is just growing pains. Once people actually realize the
benefit of ARM for a laptop, this trend will seem meaningless and
forgotten.
Lenovo is claiming a 28-hour battery life. Is that a truly compelling
reason to go with ARM? Battery life has always been featured in laptop
reviews but they have to write about something. It's like a long article
about why the 2025 Toyota is so much better than the 2024 Toyota.

I may not be the typical laptop user. I can't remember when the two next
to me were unplugged other than the six-day power outage in July. Sure, I
see people using laptops in the library, but the new library has outlets
all over the place, as does my favorite espresso place.

People have developed laptop usage patterns over the last 25 years or so.
Maybe ARM will succeed this time. Microsoft has tried to chase Apple
before with limited success. Apple was a little cagier too. 'Apple
Silicon' is a nice buzzword that doesn't suggest a new architecture to the
plebes.
Joel
2024-12-14 06:18:14 UTC
Post by rbowman
Post by Joel
Lol, yeah, this is just growing pains. Once people actually realize the
benefit of ARM for a laptop, this trend will seem meaningless and
forgotten.
Lenovo is claiming a 28-hour battery life. Is that a truly compelling
reason to go with ARM? Battery life has always been featured in laptop
reviews but they have to write about something. It's like a long article
about why the 2025 Toyota is so much better than the 2024 Toyota.
I may not be the typical laptop user. I can't remember when the two next
to me were unplugged other than the six-day power outage in July. Sure, I
see people using laptops in the library, but the new library has outlets
all over the place, as does my favorite espresso place.
People have developed laptop usage patterns over the last 25 years or so.
Maybe ARM will succeed this time. Microsoft has tried to chase Apple
before with limited success. Apple was a little cagier too. 'Apple
Silicon' is a nice buzzword that doesn't suggest a new architecture to the
plebes.
I believe that the software will follow the new hardware.
--
Joel W. Crump

Amendment XIV
Section 1.

[...] No state shall make or enforce any law which shall
abridge the privileges or immunities of citizens of the
United States; nor shall any state deprive any person of
life, liberty, or property, without due process of law;
nor deny to any person within its jurisdiction the equal
protection of the laws.

Dobbs rewrites this, it is invalid precedent. States are
liable for denying needed abortions, e.g. TX.
CrudeSausage
2024-12-14 13:27:44 UTC
Post by Joel
Post by rbowman
Post by Joel
Lol, yeah, this is just growing pains. Once people actually realize the
benefit of ARM for a laptop, this trend will seem meaningless and
forgotten.
Lenovo is claiming a 28-hour battery life. Is that a truly compelling
reason to go with ARM? Battery life has always been featured in laptop
reviews but they have to write about something. It's like a long article
about why the 2025 Toyota is so much better than the 2024 Toyota.
I may not be the typical laptop user. I can't remember when the two next
to me were unplugged other than the six-day power outage in July. Sure, I
see people using laptops in the library, but the new library has outlets
all over the place, as does my favorite espresso place.
People have developed laptop usage patterns over the last 25 years or so.
Maybe ARM will succeed this time. Microsoft has tried to chase Apple
before with limited success. Apple was a little cagier too. 'Apple
Silicon' is a nice buzzword that doesn't suggest a new architecture to the
plebes.
I believe that the software will follow the new hardware.
I hope you're right. Microsoft has a way of fucking incredibly simple
things up. Apple is already offering what Microsoft is promising so if
using ARM is that important, people should go that route.

Considering Microsoft's spotty history when it comes to transitions of
any kind, I would expect them to abandon ARM entirely if computers using
that architecture don't set the world on fire in their first year of
availability.
--
CrudeSausage
CrudeSausage
2024-12-14 13:23:22 UTC
Post by rbowman
Post by Joel
Lol, yeah, this is just growing pains. Once people actually realize the
benefit of ARM for a laptop, this trend will seem meaningless and
forgotten.
Lenovo is claiming a 28-hour battery life. Is that a truly compelling
reason to go with ARM? Battery life has always been featured in laptop
reviews but they have to write about something. It's like a long article
about why the 2025 Toyota is so much better than the 2024 Toyota.
I may not be the typical laptop user. I can't remember when the two next
to me were unplugged other than the six-day power outage in July. Sure, I
see people using laptops in the library, but the new library has outlets
all over the place, as does my favorite espresso place.
People have developed laptop usage patterns over the last 25 years or so.
Maybe ARM will succeed this time. Microsoft has tried to chase Apple
before with limited success. Apple was a little cagier too. 'Apple
Silicon' is a nice buzzword that doesn't suggest a new architecture to the
plebes.
People probably don't need 24-hour battery life, but they will be
thrilled to have it either way. My gaming laptop boasted of ten-hour
battery life when I got it and it can definitely do that if you don't
use the horrible software ASUS provides you by default. I usually have a
power outlet near wherever I use the computer, but I didn't have one as
I was correcting yesterday and needed to look at the students' workbook
on the screen all the while entering grades. I was hoping that I'd have
enough juice to last the three hours I was there. In the end, I used
about 25% of the battery viewing the original book on screen and
listening to a local radio station which talked about the Canadiens'
horrible loss the night before. In other words, decent battery life not
only allowed me to get the job done but also let me enjoy myself while
doing boring work.

However, notice that I was apprehensive about the laptop having enough
battery life for the task despite knowing that it can usually handle ten
hours without issue. My habit of keeping the machine plugged in makes me
fear not having the adapter around. Meanwhile, when I had a MacBook Air
M1, I was so used to not having an adapter around that I wouldn't have
feared not having enough battery life. Once you experience the latter,
you get addicted to it and don't want to return to a life where you feel
it necessary to lug around anything other than the laptop itself. The
battery life of machines is indeed going to be a major selling point in
the very near future.
--
CrudeSausage
-hh
2024-12-15 12:07:28 UTC
Post by CrudeSausage
...
People probably don't need 24-hour battery life, but they will be
thrilled to have it either way. My gaming laptop boasted of ten-hour
battery life when I got it and it can definitely do that if you don't
use the horrible software ASUS provides you by default...
My take is that a "long hours" laptop life needs to not be BS in order
to actually be of benefit. I can recall getting a new Thinkpad at work
at one point which bragged about 8 hours ... and when just running
MS-Office stuff, reality was more like 2.5 hours.
Post by CrudeSausage
However, notice that I was apprehensive about the laptop having enough
battery life for the task despite knowing that it can usually handle ten
hours without issue. My habit of keeping the machine plugged in makes me
fear not having the adapter around. Meanwhile, when I had a MacBook Air
M1, I was so used to not having an adapter around that I wouldn't have
feared not having enough battery life. Once you experience the latter,
you get addicted to it and don't want to return to a life where you feel
it necessary to lug around anything other than the laptop itself. The
battery life of machines is indeed going to be a major selling point in
the very near future.
Agreed. Apple's M hardware machines are already there. It's now up to
the WinTel world to catch up...

...and for Linux too, if it hasn't already done a good job of porting its
OS & apps to the Apple M architecture.


-hh
CrudeSausage
2024-12-15 15:23:06 UTC
Post by -hh
Post by CrudeSausage
...
People probably don't need 24-hour battery life, but they will be
thrilled to have it either way. My gaming laptop boasted of ten-hour
battery life when I got it and it can definitely do that if you don't
use the horrible software ASUS provides you by default...
My take is that a "long hours" laptop life needs to not be BS in order
to actually be of benefit.  I can recall getting a new Thinkpad at work
at one point which bragged about 8 hours ... and when just running MS-
Office stuff, reality was more like 2.5 hours.
That's why I usually take claims made by PC manufacturers with a grain
of salt. Even the ten-hour claim made by ASUS for this machine didn't
seem too believable. It is, after all, quite theoretical. However, if
you allow the battery to charge to 100% and remove the software ASUS
itself supplies with the machine, you can get a discharge rate of about
5.5 to 7.5 W on average from a battery which charges to 76,000 mWh. The
battery will wear out fairly quickly though, so that 76,000 figure will
drop to 74,000, then 72,000, then 67,000, then 60,000 very quickly. I
imagine that if
they adopted slow charging from 80% to 100% like Apple does, they could
limit the wear though.
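
To sanity-check those corrected figures, here's a rough back-of-the-envelope
sketch in Python (assuming, as above, capacity in mWh and an average draw in
watts; these are the post's numbers, not a datasheet):

# Rough runtime estimate: energy stored divided by average power draw.
def runtime_hours(capacity_mwh, draw_w):
    return (capacity_mwh / 1000.0) / draw_w

for capacity in (76_000, 60_000):   # new vs. well-worn battery, in mWh
    for draw in (5.5, 7.5):         # average draw, in watts
        print(f"{capacity} mWh at {draw} W: "
              f"{runtime_hours(capacity, draw):.1f} h")
# A new 76,000 mWh battery gives ~13.8 h at 5.5 W and ~10.1 h at 7.5 W,
# which lines up with the ten-hour claim; a worn 60,000 mWh battery
# drops to roughly 8-11 h.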
Post by -hh
Post by CrudeSausage
However, notice that I was apprehensive about the laptop having enough
battery life for the task despite knowing that it can usually handle
ten hours without issue. My habit of keeping the machine plugged in makes
me fear not having the adapter around. Meanwhile, when I had a MacBook
Air M1, I was so used to not having an adapter around that I wouldn't
have feared not having enough battery life. Once you experience the
latter, you get addicted to it and don't want to return to a life
where you feel it necessary to lug around anything other than the
laptop itself. The battery life of machines is indeed going to be a
major selling point in the very near future.
Agreed. Apple's M hardware machines are already there. It's now up to
the WinTel world to catch up...
...and for Linux too, if it hasn't already done a good job of porting its
OS & apps to the Apple M architecture.
I'd say it would be unbelievable had Apple not already done it several
times before. You can count on them to get things right because they excel
across the board. They made the transition from 68000 to PowerPC
seamless for most users. Similarly, they made the transition from Mac OS
9.2.2 to OS X easy by having the old OS load in the background to run the
aging apps on the new OS. Somehow, they even made the transition from
PowerPC to Intel mostly transparent to users with only performance being
an issue (to be expected). Clearly, when they decided to move to their
own processors, people with a good knowledge of Apple's history could
not help but be excited because there was no way they could screw it up
and, frankly, they haven't. Even though I no longer have one (mostly out
of a concern that the machine would die from the SSD wearing out), there
is no doubt in my mind that the MacBook is a superior work machine to
any PC I could buy because of its complete lack of distractions, its
consistently stellar performance, its phenomenal battery life and its
rock-solid stability.
--
CrudeSausage
CrudeSausage
2024-12-14 12:56:30 UTC
Post by Joel
Post by rbowman
Post by Joel
https://www.tomshardware.com/pc-components/cpus/intels-interim-co-ceo-claims-retailers-are-concerned-by-return-rate-of-qualcomm-powered-machines
Post by Joel
Bottom line, people are having an issue with familiarity and returning
their Windows ARM devices because they don't operate quite the same way.
Duh, that's a good thing: it's superior tech for a laptop. Don't just
return it.
"Compatibility issues are, of course, thought to be the primary reason why
people return Snapdragon X-based systems to retailers."
Windows RT redux. Hopefully the same vicious cycle won't repeat.
"This thing won't run my favorite Windows app. I'm taking it back!"
"Those things aren't selling. We're not going to waste time writing native
Arm apps."
"Low sales suggest another Microsoft flop. Share price declining."
Lol, yeah, this is just growing pains. Once people actually realize
the benefit of ARM for a laptop, this trend will seem meaningless and
forgotten.
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
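
To put a number on performance per watt, here's a toy Python sketch; the
benchmark scores and package wattages below are invented for illustration,
not measurements of any real chip:

# Performance per watt = benchmark score / package power.
# All figures are hypothetical.
chips = {
    "hypothetical x86-64 laptop chip": (2400, 28.0),  # (score, watts)
    "hypothetical ARM laptop chip":    (2400, 15.0),
}
for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.0f} points per watt")
# The same score at roughly half the power is exactly what shows up as
# longer battery life on the same battery.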
--
CrudeSausage
RonB
2024-12-14 15:44:49 UTC
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
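
For scale, a rough sketch of the yearly cost, assuming a 76 Wh battery
charged fully once a day, 90% adapter efficiency, and EUR 0.40/kWh (all
three numbers are assumptions, not measurements):

# Annual cost of charging a laptop once per day.
battery_wh = 76            # assumed energy delivered per full charge
adapter_efficiency = 0.90  # assumed wall-to-battery efficiency
price_per_kwh = 0.40       # assumed price, EUR

kwh_per_year = battery_wh / adapter_efficiency * 365 / 1000
print(f"{kwh_per_year:.1f} kWh/year, about "
      f"EUR {kwh_per_year * price_per_kwh:.2f}/year")
# Roughly 30.8 kWh and EUR 12 a year, small next to a fridge or heating.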
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-14 17:55:34 UTC
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
--
CrudeSausage
RonB
2024-12-15 07:23:22 UTC
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-15 15:12:13 UTC
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
--
CrudeSausage
RonB
2024-12-16 10:27:19 UTC
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
I'm sorry, but I'm skeptical that the electricity needed to charge a laptop
is that big of a concern, even in Europe.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-16 15:59:04 UTC
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
I'm sorry, but I'm skeptical that the electricity needed to charge a laptop
is that big of a concern, even in Europe.
In that case, you should look at how Germany's economy is tanking,
specifically as a result of the lack of cheap oil coming in from Russia.
You can imagine that the smaller supply of oil makes electrical
production more expensive and power bills much higher for the average
German. As a result, they are not as likely as they once might have been
to buy a powerful PC that draws 800 W for every hour it spends running a
game.
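
For a sense of scale, the same arithmetic applied to a PC drawing 800 W
while gaming versus a laptop at 60 W (the 800 W figure is from the
paragraph above; the laptop draw and the price are assumptions):

# Electricity cost per hour of gaming at an assumed EUR 0.40/kWh.
price_per_kwh = 0.40
for name, watts in (("800 W gaming PC", 800), ("60 W gaming laptop", 60)):
    print(f"{name}: EUR {watts / 1000 * price_per_kwh:.2f} per hour")
# 800 W gaming PC: EUR 0.32 per hour
# 60 W gaming laptop: EUR 0.02 per hour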
--
CrudeSausage
RonB
2024-12-17 08:13:12 UTC
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
I'm sorry, but I'm skeptical that the electricity needed to charge a laptop
is that big of a concern, even in Europe.
In that case, you should look at how Germany's economy is tanking,
specifically as a result of the lack of cheap oil coming in from Russia.
You can imagine that the smaller supply of oil makes electrical
production more expensive and power bills much higher for the average
German. As a result, they are not as likely as they once might have been
to buy a powerful PC that draws 800 W for every hour it spends running a
game.
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about the
amount of electricity required to charge a laptop.

I purposely use low power laptops and micro desktops because it's all I need
and I don't like the background sound of fans. These all run Intel CPUs
(except for the Wyse 5060 thin client desktop — it uses a low power AMD
CPU).

And, as usual, the standard disclaimer, I don't play Windows' video games or
use high-end (watt gobbling) GPUs. I'm not sure, though, that ARM chips will
be running these games in the future. (I guess we'll see.)
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-17 13:57:42 UTC
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
I'm sorry, but I'm skeptical that the electricity needed to charge a laptop
is that big of a concern, even in Europe.
In that case, you should look at how Germany's economy is tanking,
specifically as a result of the lack of cheap oil coming in from Russia.
You can imagine that the smaller supply of oil makes electrical
production more expensive and power bills much higher for the average
German. As a result, they are not as likely as they once might have been
to buy a powerful PC that draws 800 W for every hour it spends running a
game.
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about the
amount of electricity required to charge a laptop.
If the price you pay for electricity doubles, you are likely to look at
the devices in your house and make changes in the kind of machine you
buy. The promise of charging once a day rather than keeping a machine
plugged in is likely to be a benefit to a European. The people of North
America probably won't care as much since power is cheap here.
Post by RonB
I purposely use low power laptops and micro desktops because it's all I need
and I don't like the background sound of fans. These all run Intel CPUs
(except for the Wyse 5060 thin client desktop — it uses a low power AMD
CPU).
And, as usual, the standard disclaimer, I don't play Windows' video games or
use high-end (watt gobbling) GPUs. I'm not sure, though, that ARM chips will
be running these games in the future. (I guess we'll see.)
ARM might, but I don't care to stick around to find out. At best, I
would imagine that ARM will play today's games as well as today's x86-64
PCs around 2027 or so through some compatibility layer. If it happens
sooner, all the better.
--
CrudeSausage
RonB
2024-12-17 20:30:50 UTC
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
I'm sorry, but I'm skeptical that the electricity needed to charge a laptop
is that big of a concern, even in Europe.
In that case, you should look at how Germany's economy is tanking,
specifically as a result of the lack of cheap oil coming in from Russia.
You can imagine that the smaller supply of oil makes electrical
production more expensive and power bills much higher for the average
German. As a result, they are not as likely as they once might have been
to buy a powerful PC that draws 800 W for every hour it spends running a
game.
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about the
amount of electricity required to charge a laptop.
If the price you pay for electricity doubles, you are likely to look at
the devices in your house and make changes in the kind of machine you
buy. The promise of charging once a day rather than keeping a machine
plugged in is likely to be a benefit to a European. The people of North
America probably won't care as much since power is cheap here.
Hypotheticals. I'll remain skeptical that this will be a major issue.
(Unless, of course, there is no power at all — which may be a reality in
Europe if they keep going down the destructive paths they've chosen. In that
case keeping food from spoiling will probably take priority over laptop
charging — of any kind).
Post by CrudeSausage
Post by RonB
I purposely use low power laptops and micro desktops because it's all I need
and I don't like the background sound of fans. These all run Intel CPUs
(except for the Wyse 5060 thin client desktop — it uses a low power AMD
CPU).
And, as usual, the standard disclaimer, I don't play Windows' video games or
use high-end (watt gobbling) GPUs. I'm not sure, though, that ARM chips will
be running these games in the future. (I guess we'll see.)
ARM might, but I don't care to stick around to find out. At best, I
would imagine that ARM will play today's games as well as today's x86-64
PCs around 2027 or so through some compatibility layer. If it happens
sooner, all the better.
I'm guessing the power required to run Windows' complex video games will not
fit in ARM's low-power "wheelhouse." But we'll see. As I've mentioned (many
times now) I'm not a game player.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-17 23:01:02 UTC
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe,
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
I'm sorry, but I'm skeptical that the electricity needed to charge a laptop
is that big of a concern, even in Europe.
In that case, you should look at how Germany's economy is tanking,
specifically as a result of the lack of cheap oil coming in from Russia.
You can imagine that the smaller supply of oil makes electrical
production more expensive and power bills much higher for the average
German. As a result, they are not as likely as they once might have been
to buy a powerful PC that draws 800 W for every hour it spends running a
game.
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about the
amount of electricity required to charge a laptop.
If the price you pay for electricity doubles, you are likely to look at
the devices in your house and make changes in the kind of machine you
buy. The promise of charging once a day rather than keeping a machine
plugged in is likely to be a benefit to a European. The people of North
America probably won't care as much since power is cheap here.
Hypotheticals. I'll remain skeptical that this will be a major issue.
(Unless, of course, there is no power at all — which may be a reality in
Europe if they keep going down the destructive paths they've chosen. In that
case keeping food from spoiling will probably take priority over laptop
charging — of any kind).
Only as long as whatever work you do doesn't depend on you having a
computer.
Post by RonB
Post by CrudeSausage
Post by RonB
I purposely use low power laptops and micro desktops because it's all I need
and I don't like the background sound of fans. These all run Intel CPUs
(except for the Wyse 5060 thin client desktop — it uses a low power AMD
CPU).
And, as usual, the standard disclaimer, I don't play Windows' video games or
use high-end (watt gobbling) GPUs. I'm not sure, though, that ARM chips will
be running these games in the future. (I guess we'll see.)
ARM might, but I don't care to stick around to find out. At best, I
would imagine that ARM will play today's games as well as today's x86-64
PCs around 2027 or so through some compatibility layer. If it happens
sooner, all the better.
I'm guessing the power required to run Windows' complex video games will not
fit in ARM's low-power "wheelhouse." But we'll see. As I've mentioned (many
times now) I'm not a game player.
ARM being low-power doesn't mean that it is low-performance. As the
Apple processors have shown, they're a lot more powerful than x86-64
processors on single-core applications. They're only worse on multi-core
and even then, not by much. ARM basically gives people the performance
they already have, but on much less battery power.
--
CrudeSausage
Lawrence D'Oliveiro
2024-12-18 06:00:32 UTC
Post by CrudeSausage
ARM being low-power doesn't mean that it is low-performance.
Fujitsu’s chips are proof of that
<https://www.tomshardware.com/pc-components/cpus/fujitsu-flaunts-144-core-monaka-cpu-2nm-and-5nm-chiplets-soic-and-cowos-packaging>.
And without the upgradeability limitations of Apple, too.
CrudeSausage
2024-12-18 13:50:15 UTC
Post by Lawrence D'Oliveiro
Post by CrudeSausage
ARM being low-power doesn't mean that it is low-performance.
Fujitsu’s chips are proof of that
<https://www.tomshardware.com/pc-components/cpus/fujitsu-flaunts-144-core-monaka-cpu-2nm-and-5nm-chiplets-soic-and-cowos-packaging>.
And without the upgradeability limitations of Apple, too.
Except that Apple already has a ton of the software running natively on
ARM, and whatever hasn't been converted runs almost as well as the
native stuff. The PC side has a ton of promises that it will deliver
something better in the future than what Apple sells _today_. Those
promises don't produce the hope in me that they did twenty years ago
because I've been disappointed a number of times.
--
CrudeSausage
Lawrence D'Oliveiro
2024-12-19 02:04:50 UTC
Post by CrudeSausage
Post by Lawrence D'Oliveiro
Post by CrudeSausage
ARM being low-power doesn't mean that it is low-performance.
Fujitsu’s chips are proof of that
<https://www.tomshardware.com/pc-components/cpus/fujitsu-flaunts-144-core-monaka-cpu-2nm-and-5nm-chiplets-soic-and-cowos-packaging>.
And without the upgradeability limitations of Apple, too.
Except that Apple already has a ton of the software running natively on
ARM ...
Sure, if Apple stuff is sufficient for you. Linux stuff runs natively
on ARM and a bunch of other architectures, as well.
CrudeSausage
2024-12-19 13:45:43 UTC
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Post by Lawrence D'Oliveiro
Post by CrudeSausage
ARM being low-power doesn't mean that it is low-performance.
Fujitsu’s chips are proof of that
<https://www.tomshardware.com/pc-components/cpus/fujitsu-flaunts-144-core-monaka-cpu-2nm-and-5nm-chiplets-soic-and-cowos-packaging>.
And without the upgradeability limitations of Apple, too.
Except that Apple already has a ton of the software running natively on
ARM ...
Sure, if Apple stuff is sufficient for you. Linux stuff runs natively
on ARM and a bunch of other architectures, as well.
It does, but Linux's stuff is already hard to sell on x86-64 so you can
imagine how alluring it is on ARM. Nevertheless, if I were using an
ARM-based PC at the moment, I would be using Linux on it and watching to
see if Microsoft _eventually_ gets its act together. I doubt that it
ever will which is why I'm pushing people to consider Apple if ARM is
what they're looking for.
--
CrudeSausage
Lawrence D'Oliveiro
2024-12-19 19:53:44 UTC
Post by CrudeSausage
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Except that Apple already has a ton of the software running natively
on ARM ...
Sure, if Apple stuff is sufficient for you. Linux stuff runs natively
on ARM and a bunch of other architectures, as well.
It does, but Linux's stuff is already hard to sell on x86-64 so you can
imagine how alluring it is on ARM.
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
chrisv
2024-12-19 20:34:46 UTC
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
--
'You (the lame Linux "community") have always decided on one "best
distro".' - DumFSck, lying shamelessly
CrudeSausage
2024-12-19 22:15:51 UTC
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
--
CrudeSausage
chrisv
2024-12-20 00:28:07 UTC
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
--
"two shakes of Liarmutt's pee-pee." - "Hadron"
CrudeSausage
2024-12-20 01:08:03 UTC
Post by chrisv
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
Where are the profits in producing an operating system and software for
such hardware?
--
CrudeSausage
Lawrence D'Oliveiro
2024-12-20 01:16:38 UTC
Post by CrudeSausage
Where are the profits in producing an operating system and software for
such hardware?
There are no profits to be made in operating systems any more, for any
hardware. You see this in the deteriorating quality of Microsoft Windows.
CrudeSausage
2024-12-20 13:37:45 UTC
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Where are the profits in producing an operating system and software for
such hardware?
There are no profits to be made in operating systems any more, for any
hardware. You see this in the deteriorating quality of Microsoft Windows.
Except that 11 is better than 10 and 10, depending on who you ask, is
better than 8.1. No one will say that they preferred 8.1 over 7 but it
was an improvement for anyone who had a touchscreen. Similarly, 7 was
more stable than Vista. The only Windows versions that are definitely
worse than their predecessors are Millennium Edition and Vista, but
Vista was actually superior to XP in a number of ways that were ignored
by people unsatisfied by its poor performance on modest hardware.
--
CrudeSausage
RonB
2024-12-20 18:31:46 UTC
Post by CrudeSausage
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Where are the profits in producing an operating system and software for
such hardware?
There are no profits to be made in operating systems any more, for any
hardware. You see this in the deteriorating quality of Microsoft Windows.
Except that 11 is better than 10 and 10, depending on who you ask, is
better than 8.1. No one will say that they preferred 8.1 over 7 but it
was an improvement for anyone who had a touchscreen. Similarly, 7 was
more stable than Vista. The only Windows versions that are definitely
worse than their predecessors are Millennium Edition and Vista, but
Vista was actually superior to XP in a number of ways that were ignored
by people unsatisfied by its poor performance on modest hardware.
Sure 11 is "better," if you like even more advertising in your OS and AI
horse crap. (But I admit it does updates slightly better... only a couple
hours instead of four or five.)

And you're forgetting Windows 8, which was definitely worse than 7 (and
almost completely ignored by Windows users). Windows 8.1 was basically
Windows 10.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-20 18:42:25 UTC
Post by RonB
Post by CrudeSausage
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Where are the profits in producing an operating system and software for
such hardware?
There are no profits to be made in operating systems any more, for any
hardware. You see this in the deteriorating quality of Microsoft Windows.
Except that 11 is better than 10 and 10, depending on who you ask, is
better than 8.1. No one will say that they preferred 8.1 over 7 but it
was an improvement for anyone who had a touchscreen. Similarly, 7 was
more stable than Vista. The only Windows versions that are definitely
worse than their predecessors are Millennium Edition and Vista, but
Vista was actually superior to XP in a number of ways that were ignored
by people unsatisfied by its poor performance on modest hardware.
Sure 11 is "better," if you like even more advertising in your OS and AI
horse crap. (But I admit it does updates slightly better... only a couple
hours instead of four or five.)
Perhaps there is lots of advertising in Windows 11 if you are in the
United States, but I don't see anything but the weather and news items
here.
Post by RonB
And you're forgetting Windows 8, which was definitely worse than 7 (and
almost completely ignored by Windows users). Windows 8.1 was basically
Windows 10.
Windows 8.1 was much snappier than 10 in everyday use but just as bad
with updates. 8 only sucked at first if you had no idea where the Start
button went; it was replaced by simply navigating to the bottom-left
corner. Once you figured that out, it was fine.
However, it gave too much importance to the modern apps.
--
CrudeSausage
chrisv
2024-12-20 03:03:50 UTC
Post by CrudeSausage
Post by chrisv
Post by CrudeSausage
Post by chrisv
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
Where are the profits in producing an operating system and software for
such hardware?
Err... It's due to the efficiency of FOSS that there isn't any.

Sheesh.
RonB
2024-12-20 05:56:41 UTC
Post by CrudeSausage
Post by chrisv
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
Where are the profits in producing an operating system and software for
such hardware?
Open source is not profit driven. The Raspberry Pi computers make a lot of
money for small start-up companies. But they're making hardware, not
software.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-20 13:58:12 UTC
Post by RonB
Post by CrudeSausage
Post by chrisv
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
Where are the profits in producing an operating system and software for
such hardware?
Open source is not profit driven. The Raspberry Pi computers make a lot of
money for small start-up companies. But they're making hardware, not
software.
And this is why GNOME is a few months away from going bankrupt as is
Mozilla after Google's legal loss. This is also why KDE nags you for
donations when you first use the desktop environment. Say what you will
about open-source users, but they're pretty bad at giving money to the
organizations that produce the software they use. This is also why the
most talented programmers working on open-source eventually give up and
produce software that they can sell.
--
CrudeSausage
RonB
2024-12-20 18:28:27 UTC
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by chrisv
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
Where are the profits in producing an operating system and software for
such hardware?
Open source is not profit driven. The Raspberry Pi computers make a lot of
money for small start-up companies. But they're making hardware, not
software.
And this is why GNOME is a few months away from going bankrupt as is
Mozilla after Google's legal loss. This is also why KDE nags you for
donations when you first use the desktop environment. Say what you will
about open-source users, but they're pretty bad at giving money to the
organizations that produce the software they use. This is also why the
most talented programmers working on open-source eventually give up and
produce software that they can sell.
A quick lookup on the GNOME Foundation's (not GNOME's) supposed upcoming
bankruptcy turned up an exchange between two posters on Reddit...

Who started this fake news saying that GNOME would "go bankrupt"? I'm
really curious. We're talking about a non-profit organization.

Lunduke. It's always f-ing Lunduke.

Hope someone sues his ass one day for this type of shit.

I'm guessing that the Lunduke moron was your "source" (so to speak).
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-20 18:39:09 UTC
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by chrisv
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
Where are the profits in producing an operating system and software for
such hardware?
Open source is not profit driven. The Raspberry Pi computers make a lot of
money for small start-up companies. But they're making hardware, not
software.
And this is why GNOME is a few months away from going bankrupt as is
Mozilla after Google's legal loss. This is also why KDE nags you for
donations when you first use the desktop environment. Say what you will
about open-source users, but they're pretty bad at giving money to the
organizations that produce the software they use. This is also why the
most talented programmers working on open-source eventually give up and
produce software that they can sell.
A quick lookup on the GNOME Foundation's (not GNOME's) supposed upcoming
bankruptcy turned up an exchange between two posters on Reddit...
Who started this fake news saying that GNOME would "go bankrupt"? I'm
really curious. We're talking about a non-profit organization.
Lunduke. It's always f-ing Lunduke.
Hope someone sues his ass one day for this type of shit.
I'm guessing that the Lunduke moron was your "source" (so to speak).
It was indeed Lunduke, but he's basing his report on GNOME's own
numbers. I hope that you'll apologize when it indeed goes bankrupt not
too long from now.
--
CrudeSausage
CrudeSausage
2024-12-20 18:56:22 UTC
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by chrisv
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, the Raspberry Pi?
Where are the profits in producing an operating system and software for
such hardware?
Open source is not profit driven. The Raspberry Pi computers make a lot of
money for small start-up companies. But they're making hardware, not
software.
And this is why GNOME is a few months away from going bankrupt as is
Mozilla after Google's legal loss. This is also why KDE nags you for
donations when you first use the desktop environment. Say what you will
about open-source users, but they're pretty bad at giving money to the
organizations that produce the software they use. This is also why the
most talented programmers working on open-source eventually give up and
produce software that they can sell.
A quick lookup on the GNOME Foundation's (not GNOME's) supposed upcoming
bankruptcy. An exchange between two posters on Reddit...
Who started this fake news saying that GNOME would "go bankrupt"? I'm
really curious. We're talking about a non-profit organization.
Lunduke. It's always f-ing Lunduke.
Hope someone sues his ass one day for this type of shit.
I'm guessing that the Lunduke moron was your "source" (so to speak).
I just want to add that Lunduke is currently hated by the woke idiots
running Linux because he tells the truth. He's the one who revealed Red
Hat and IBM's discriminatory practices; he's also the one who revealed
how Mozilla is turning into its own woke organization. Everyone else is
refusing to speak about the elephant in the room, but he's pointing it
out. Feel free to read his entire article with the numbers GNOME itself
has provided and show where he is wrong, especially now that GNOME has
laid off 33% of its staff.

Face it: you're being played by the same bearded men who are telling you
they're women.
--
CrudeSausage
rbowman
2024-12-20 01:19:09 UTC
Reply
Permalink
Post by chrisv
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software
native for both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
So where's the payware OS and software for, say, Raspberry Pi?
Windows 11! Some people with way too much time on their hands kind of,
sort of got it to come up.

https://all3dp.com/2/install-windows-11-raspberry-pi-5/

I've got the CanaKit pictured and it's well worth the few extra bucks to
get the case, fan, power supply, and cables. It comes with Raspberry Pi OS
on a microSD, which was good enough for me. I've got other projects ahead
of seeing what other distros might work.
RonB
2024-12-20 05:54:35 UTC
Reply
Permalink
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
And vice versa.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-20 13:55:51 UTC
Reply
Permalink
Post by RonB
Post by CrudeSausage
Post by chrisv
Post by Lawrence D'Oliveiro
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Payware vendors can't compete with the efficiency of FOSS.
They can: they eventually implement features that the open-source world
will copy and implement later in a more rudimentary form.
And vice versa.
Admittedly, "rudimentary" is the key word when talking about the Windows
Store. You can find software there, but it's rather poor in comparison
to the win32 stuff you'll find on the web. To think that Microsoft
offers you the option to obtain software only from there in Windows...
--
CrudeSausage
CrudeSausage
2024-12-19 22:14:38 UTC
Reply
Permalink
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Except that Apple already has a ton of the software running natively
on ARM ...
Sure, if Apple stuff is sufficient for you. Linux stuff runs natively
on ARM and a bunch of other architectures, as well.
It does, but Linux's stuff is already hard to sell on x86-64 so you can
imagine how alluring it is on ARM.
Linux is the only platform that offers a full suite of software native for
both ARM and x86 (both 32- and 64-bit). Nobody else does.
Agreed, that's why it is a much better choice for a non-Apple ARM
computer than Windows.
--
CrudeSausage
RonB
2024-12-18 11:09:06 UTC
Reply
Permalink
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
While I agree that most people want longer battery life for their laptops, I
really don't think the cost of charging a laptop is that big of a concern.
Not to a person who lives in an area where electricity is cheap.
However, it is only going to become more expensive in places like Europe
where its production depends on a resource acquired from Russia. The same
way they switched to fuel-efficient or electric cars to lower their
reliance on gasoline, they are probably going to switch to
energy-efficient machines to reduce their need for electricity altogether.
If things are getting that dire in Europe they're going to have to learn
to live without computers at all.
If this were the 80s and Europe were facing these issues, I imagine that
either Atari or Commodore would have produced a very efficient computer
which would only need to be charged once daily. Let's not forget how
popular the ST and the Amiga were over there while they were failing
miserably in North America. Because both companies are dead, the most
likely scenario is that they will move to the efficient machines made by
Apple or equipped with Qualcomm's processors. I do not think that their
energy crisis is going to get better anytime soon.
I'm sorry, but I'm skeptical that the electricity needed to charge a laptop
is that big of a concern, even in Europe.
In that case, you should look at how Germany's economy is tanking,
specifically the result of a lack of cheap oil coming in from Russia.
You can imagine that the smaller supply of oil will result in electrical
production being more expensive and for the power bills to be much
higher for the average German. As a result, they are not as likely as
they once might have been to buy the powerful PC which requires 800W of
power to play a game every hour.
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about the
amount of electricity required to charge a laptop.
If the price you pay for electricity doubles, you are likely to look at
the devices in your house and make changes in the kind of machine you
buy. The promise of charging once a day rather than keeping a machine
plugged is likely to be a benefit to a European. The people of North
America probably won't care as much since power is cheap here.
Hypotheticals. I'll remain skeptical that this will be a major issue.
(Unless, of course, there is no power at all — which may be a reality in
Europe if they keep going down the destructive paths they've chosen. In that
case keeping food from spoiling will probably take priority over laptop
charging — of any kind).
Only as long as whatever work you do doesn't depend on you having a
computer.
Post by RonB
Post by CrudeSausage
Post by RonB
I purposely use low power laptops and micro desktops because it's all I need
and I don't like the background sound of fans. These all run Intel CPUs
(except for the Wyse 5060 thin client desktop — it uses a low power AMD
CPU).
And, as usual, the standard disclaimer, I don't play Windows' video games or
use high-end (watt gobbling) GPUs. I'm not sure, though, that ARM chips will
be running these games in the future. (I guess we'll see.)
ARM might, but I don't care to stick around to find out. At best, I
would imagine that ARM will play today's games as well as today's x86-64
PCs around 2027 or so through some compatibility layer. If it happens
sooner, all the better.
I'm guessing the power required to run Windows' complex video games will not
fit in ARM's low-power "wheelhouse." But we'll see. As I've mentioned (many
times now) I'm not a game player.
ARM being low-power doesn't mean that it is low-performance. As the
Apple processors have shown, they're a lot more powerful than x86-64
processors in single-core applications. They're only worse in multi-core
workloads, and even then not by much. ARM basically gives people the
performance they already have while drawing much less battery power.
I'll watch and see what happens. I don't anticipate getting an ARM laptop
(or desktop) in the near future — but then I don't anticipate buying any new
computers at all in the next ten years (or so).
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
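
The single-core claim in that exchange is easy to spot-check informally.
Below is a minimal sketch using only Python's standard library; run the
same file on an x86-64 box and on an ARM machine and compare the
wall-clock times. The workload and iteration counts are arbitrary
choices, and a toy loop like this is nothing like a real benchmark suite:

# Crude single-core spot check: time the same CPU-bound loop on
# different machines. Interpreter version and thermals will skew the
# numbers, so treat the result as a rough indication only.
import platform
import timeit

def workload() -> int:
    # Integer-heavy busy loop; the arithmetic is an arbitrary choice.
    total = 0
    for i in range(1_000_000):
        total += i * i % 7
    return total

elapsed = timeit.timeit(workload, number=10)
print(f"{platform.machine()}: {elapsed:.2f}s for 10 runs")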
CrudeSausage
2024-12-18 14:09:08 UTC
Reply
Permalink
Post by RonB
I'll watch and see what happens. I don't anticipate getting an ARM laptop
(or desktop) in the near future — but then I don't anticipate buying any new
computers at all in the next ten years (or so).
I'm trying to hold onto the one I have for as long as possible too, but
I know that it's just a matter of time before the keyboard's keys stop
working as they should and the parts to fix the issue stop being
available. When that happens, I'll have no choice but to get another one.
--
CrudeSausage
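
The electricity-cost side of that exchange is also easy to put rough
numbers on. A back-of-the-envelope sketch in Python, where the 800 W
desktop figure comes from the exchange above, and the 15 W ARM laptop
draw, one hour of gaming per day, and 0.40 EUR/kWh tariff are
illustrative assumptions, not measurements:

# Back-of-the-envelope yearly electricity cost (illustrative numbers).
DESKTOP_WATTS = 800   # figure cited in the exchange above
LAPTOP_WATTS = 15     # assumed draw for an efficient ARM laptop
HOURS_PER_DAY = 1     # assumed gaming time per day
EUR_PER_KWH = 0.40    # assumed tariff

def yearly_cost_eur(watts: float) -> float:
    # watts -> kWh per year -> EUR per year
    return watts / 1000 * HOURS_PER_DAY * 365 * EUR_PER_KWH

print(f"desktop: {yearly_cost_eur(DESKTOP_WATTS):.2f} EUR/year")  # 116.80
print(f"laptop:  {yearly_cost_eur(LAPTOP_WATTS):.2f} EUR/year")   #   2.19

On those assumptions the gap is roughly 115 EUR a year: noticeable for
the gaming rig, negligible for laptop charging, which is consistent with
both positions above.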
RonB
2024-12-19 08:03:51 UTC
Reply
Permalink
Post by CrudeSausage
I'm trying to hold onto the one I have for as long as possible too, but
I know that it's just a matter of time before the keyboard's keys stop
working as they should and the parts to fix the issue stop being
available. When that happens, I'll have no choice but to get another one.
That's another advantage of Dell Latitudes. They made so many of them that
parts are widely available and cheap.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-19 14:00:52 UTC
Reply
Permalink
Post by RonB
That's another advantage of Dell Latitudes. They made so many of them that
parts are widely available and cheap.
True, but those parts will probably only be found in landfills after a
while. The same way that it becomes difficult to find parts for cars
after five years, it becomes hard to find parts for laptops after about
three.
--
CrudeSausage
RonB
2024-12-20 05:52:44 UTC
Reply
Permalink
Post by CrudeSausage
True, but those parts will probably only be found in landfills after a
while. The same way that it becomes difficult to find parts for cars
after five years, it becomes hard to find parts for laptops after about
three.
I don't know. I've played with a lot of old Dell Latitudes and I've always
managed to find the parts I need.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-20 13:54:25 UTC
Reply
Permalink
Post by RonB
I don't know. I've played with a lot of old Dell Latitudes and I've always
managed to find the parts I need.
I guess the Dell Latitude is the Model T of computers.
--
CrudeSausage
RonB
2024-12-20 18:20:14 UTC
Reply
Permalink
Post by CrudeSausage
I guess the Dell Latitude is the Model T of computers.
Popular and plentiful at any rate. I don't know about the newest ones.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
rbowman
2024-12-18 19:36:24 UTC
Reply
Permalink
Post by RonB
I'll watch and see what happens. I don't anticipate getting an ARM
laptop (or desktop) in the near future — but then I don't anticipate
buying any new computers at all in the next ten years (or so).
Go ahead and splurge. Get a Raspberry Pi 5 and you too can have an ARM
desktop. When I cycle through the KVM switch, the R Pi looks a lot like
the Fedora or Ubuntu box without some of the cruft. It has Chromium,
Firefox, and VS Code like the others, as well as Thonny and Mu. No
mahjongg, though.
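
For anyone cycling through boxes on a KVM like that, here is a quick way
to confirm which machine and architecture you have landed on. A minimal
sketch, assuming a Linux host; the /proc/device-tree/model read is
Raspberry-Pi-specific, and that file simply won't exist on a typical PC:

# Print architecture, kernel, and (on a Raspberry Pi) the board model.
import platform
from pathlib import Path

print("machine:", platform.machine())          # e.g. 'aarch64' on a Pi 5
print("system: ", platform.system(), platform.release())

model_file = Path("/proc/device-tree/model")   # exposed by the Pi's device tree
if model_file.exists():
    # The model string is NUL-terminated; strip that before printing.
    print("board:  ", model_file.read_bytes().rstrip(b"\x00").decode())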
RonB
2024-12-19 08:05:21 UTC
Reply
Permalink
Post by rbowman
Post by RonB
I'll watch and see what happens. I don't anticipate getting an ARM
laptop (or desktop) in the near future — but then I don't anticipate
buying any new computers at all in the next ten years (or so).
Go ahead and splurge. Get a Raspberry Pi 5 and you too can have an ARM
desktop. When I cycle through the KVM switch, the R Pi looks a lot like
the Fedora or Ubuntu box without some of the cruft. It has Chromium,
Firefox, and VS Code like the others, as well as Thonny and Mu. No
mahjongg, though.
I don't know if any of the applications I would want to use run on ARM
though.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
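
One way to answer that without buying the hardware first: on a
Debian-based system, apt can be asked whether a package has an arm64
build. A minimal sketch, assuming arm64 has been enabled as a foreign
architecture beforehand (dpkg --add-architecture arm64 followed by apt
update); the package names are examples only:

# Ask apt whether each package has an installable arm64 candidate.
# Assumes arm64 was added as a foreign architecture beforehand.
import subprocess

PACKAGES = ["firefox-esr", "vlc", "gimp"]  # example names only

for pkg in PACKAGES:
    out = subprocess.run(
        ["apt-cache", "policy", f"{pkg}:arm64"],
        capture_output=True, text=True,
    ).stdout
    # 'Candidate: (none)' (or no output at all) means nothing installable.
    ok = "Candidate:" in out and "Candidate: (none)" not in out
    print(f"{pkg}: {'arm64 build available' if ok else 'no arm64 build found'}")

In practice most of the Debian archive builds for arm64, so a check like
this mostly matters for third-party repositories.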
CrudeSausage
2024-12-19 14:02:00 UTC
Reply
Permalink
Post by RonB
Post by rbowman
Post by RonB
I'll watch and see what happens. I don't anticipate getting an ARM
laptop (or desktop) in the near future — but then I don't anticipate
buying any new computers at all in the next ten years (or so).
Go ahead and splurge. Get a Raspberry Pi 5 and you too can have an ARM
desktop. When I cycle through the KVM switch, the R Pi looks a lot like
the Fedora or Ubuntu box without some of the cruft. It has Chromium,
Firefox, and VS Code like the others, as well as Thonny and Mu. No
mahjongg, though.
I don't know if any of the applications I would want to use run on ARM
though.
I'm fairly sure that the entirety of Linux software is already available
for ARM, unlike Windows software. That is why Linux is far and away the
better option for the Snapdragon-equipped computers currently being sold.
--
CrudeSausage
rbowman
2024-12-19 19:58:42 UTC
Reply
Permalink
Post by RonB
I don't know if any of the applications I would want to use run on ARM
though.
I haven't researched how much is available. Raspberry Pi OS is a Debian
fork, although some people have put Ubuntu and other distros on the Pi.
rbowman
2024-12-17 20:22:32 UTC
Reply
Permalink
Post by RonB
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about
the amount of electricity required to charge a laptop.
There were other factors like immigration but Scholz's brilliant policies
got him a vote of no confidence and the government failed. February will
be interesting.
RonB
2024-12-18 11:05:41 UTC
Reply
Permalink
Post by rbowman
Post by RonB
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about
the amount of electricity required to charge a laptop.
There were other factors like immigration but Scholz's brilliant policies
got him a vote of no confidence and the government failed. February will
be interesting.
That Scholz always looks like he's confused. I think Trudeau in Canada
might be in trouble as well (also constantly confused).
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-18 14:06:34 UTC
Reply
Permalink
Post by RonB
Post by rbowman
Post by RonB
I don't have to "imagine" that the lack of cheap Russian gas is hurting
Germany's economy (that's plain to see every day in the international news).
I'm just having trouble imagining that this is resulting in angst about
the amount of electricity required to charge a laptop.
There were other factors like immigration but Scholz's brilliant policies
got him a vote of no confidence and the government failed. February will
be interesting.
That Scholz always looks like he's confused. I think Trudeau in Canada
might be in trouble as well (also constantly confused).
The media here is strongly suggesting that Trudeau will resign. They are
saying that there are too many factors around him pushing him in that
direction and that a January election is very likely. Of course, they
are underestimating Trudeau's self-awareness, and therefore his knowledge
of how useless he is to the world if he loses the ultimate job. He has
no knowledge of law, he has shown himself to be incompetent on the
economic front, and there is no greater example of a complete lack of
leadership. He would also be unwilling to climb the ladder of the
education system to put the one degree he does have to use, since he
once headed the country. In other words, if he steps down, he loses the
Prime Minister's salary as well as the taxpayer-funded piggy bank he's
been using for personal matters. Plus, how is he supposed to live
without access to a private jet to see all his friends around the
world?!

That faggot is going to hold onto power for dear life. He will have to
be forcibly removed from office when he loses the election he will be
forced to call at the end of 2025. Even when he loses and the
Conservatives get a majority, he will try something to keep himself in
power. He is that much of a tyrant.
--
CrudeSausage
RonB
2024-12-19 08:00:23 UTC
Reply
Permalink
Post by CrudeSausage
That faggot is going to hold onto power for dear life. He will have to
be forcibly removed from office when he loses the election he will be
forced to call at the end of 2025. Even when he loses and the
Conservatives get a majority, he will try something to keep himself in
power. He is that much of a tyrant.
I think he and Macron are twin sons of different mothers. Both idiots.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-19 13:59:39 UTC
Reply
Permalink
Post by RonB
I think he and Macron are twin sons of different mothers. Both idiots.
I think that they're desperately holding onto power because they have
been tasked with promoting Schwab's World Economic Forum agenda in their
respective countries and know that it would be dead in the water if they
left. Conservative politics are incompatible with the stakeholder
capitalism (Communism) the World Economic Forum is promoting so once
they resign, Schwab loses a good part of his power.
--
CrudeSausage
RonB
2024-12-20 05:51:17 UTC
Reply
Permalink
Post by CrudeSausage
I think that they're desperately holding onto power because they have
been tasked with promoting Schwab's World Economic Forum agenda in their
respective countries and know that it would be dead in the water if they
left. Conservative politics are incompatible with the stakeholder
capitalism (Communism) the World Economic Forum is promoting so once
they resign, Schwab loses a good part of his power.
I think Macron is one of the less intelligent Rothschilds. So he can
probably stay as long as he wants. I guess Treadle is living off his father's
name. I would like to run down Canada for putting up with him, but we just
put up with four years of a senile pervert — so I've got no ground to stand
on.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-20 13:53:03 UTC
Reply
Permalink
Post by RonB
I think Macron is one of the less intelligent Rothschilds. So he can
probably stay as long as he wants. I guess Treadle is living off his
father's name. I would like to run down Canada for putting up with him,
but we just put up with four years of a senile pervert — so I've got no
ground to stand on.
The sad part is that Trudeau is indeed living off his father's name. The
problem is that the people who loved Pierre Elliott fondly remember
Trudeaumania in the late 60s and the "hope" that he brought with him.
They don't remember the result of him being elected: massive debts and
the devaluing of our dollar. The name Trudeau should be synonymous with
incompetence, but Canadians have a short memory, just like Americans,
and have long been far too naive in believing the media that promoted
the interests of the progressives.
--
CrudeSausage
rbowman
2024-12-18 19:43:28 UTC
Reply
Permalink
Post by RonB
That Scholz always looks like he's confused. I think Trudeau in Canada
might be in trouble as well (also constantly confused).
Mutti Merkel left him a bag of shit and he wasn't the right person to
handle it. Besides he was the oldest chancellor in 50 years at the ripe
old age of 63. (ironic reference to the US preference for geriatric
cases)
Physfitfreak
2024-12-19 02:01:20 UTC
Reply
Permalink
Post by rbowman
Mutti Merkel left him a bag of shit and he wasn't the right person to
handle it. Besides he was the oldest chancellor in 50 years at the ripe
old age of 63. (ironic reference to the US preference for geriatric
cases)
63? He looks like 83. I guess beer does that to them.
RonB
2024-12-19 08:02:03 UTC
Reply
Permalink
Post by rbowman
Mutti Merkel left him a bag of shit and he wasn't the right person to
handle it. Besides he was the oldest chancellor in 50 years at the ripe
old age of 63. (ironic reference to the US preference for geriatric
cases)
He just seems clueless about how dire the situation is in Germany. Are
politicians REQUIRED to have a sub-standard IQ these days?
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
rbowman
2024-12-19 20:00:06 UTC
Reply
Permalink
Post by RonB
He just seems clueless about how dire the situation is in Germany. Are
politicians REQUIRED to have a sub-standard IQ these days?
We do seem to be far from Kennedy's best and brightest dream of Camelot.
rbowman
2024-12-14 20:04:33 UTC
Reply
Permalink
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
My existing laptop is capable of that but I don't watch videos or play
games in a day's work.
CrudeSausage
2024-12-15 13:47:26 UTC
Reply
Permalink
Post by rbowman
My existing laptop is capable of that but I don't watch videos or play
games in a day's work.
Mine can manage half of a work day too if I don't share my screen with
the classroom's projector and don't watch videos. I say this because
sharing automatically enables the GPU which multiplies the battery
discharge. If I just kept it on my desk to send e-mails, browse the web
and submit an assessment of a student, I might be able to get a day's
worth of work from a single charge. The MacBook Air I had would have
done it without a doubt though.

That's part of why I am saying that going forward, especially if
Microsoft moves fully to ARM, I'll only be using a Mac. I truly don't
believe the corporation to be capable of doing anything but completely
fucking up a transition to a new architecture.
--
CrudeSausage
Lawrence D'Oliveiro
2024-12-14 22:16:02 UTC
Reply
Permalink
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
Sure. But Windows can never give it to them.
CrudeSausage
2024-12-15 13:49:22 UTC
Reply
Permalink
Post by Lawrence D'Oliveiro
Sure. But Windows can never give it to them.
It can and it already does on Snapdragon offerings.
--
CrudeSausage
Lawrence D'Oliveiro
2024-12-15 22:24:53 UTC
Reply
Permalink
Post by CrudeSausage
It can and it already does on Snapdragon offerings.
Only with ARM-native code, of which there is precious little, with no sign
of the situation improving.
CrudeSausage
2024-12-16 01:02:51 UTC
Reply
Permalink
Post by Lawrence D'Oliveiro
Only with ARM-native code, of which there is precious little, with no sign
of the situation improving.
Which is part of why I am suggesting that anyone interested in using an
ARM-equipped machine shouldn't hold their breath waiting for Microsoft
to do a decent job and should simply go straight to Apple. I am very
much interested in ARM machines, which is why my next laptop is likely
to be an Apple.
--
CrudeSausage
RonB
2024-12-16 10:29:05 UTC
Reply
Permalink
Post by CrudeSausage
Which is part of why I am suggesting that anyone interested in using an
ARM-equipped machine shouldn't hold their breath waiting for Microsoft
to do a decent job and should simply go straight to Apple. I am very
much interested in ARM machines, which is why my next laptop is likely
to be an Apple.
But only if they play Windows video games?
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-16 16:02:53 UTC
Reply
Permalink
Post by RonB
But only if they play Windows video games?
I'm getting to a point where the only thing I want to play is NHL, and
that game is console-only. I don't have the time or energy to play those
3D games anymore.
--
CrudeSausage
RonB
2024-12-17 08:19:12 UTC
Reply
Permalink
Post by CrudeSausage
I'm getting to a point where the only thing I want to play is NHL, and
that game is console-only. I don't have the time or energy to play those
3D games anymore.
My kids have game consoles, but still want to play Windows video games. They
occasionally try to get me interested in some of them — but I guess I'm just
too old. The last video game I was fairly good at was Pong. About all I play
on my computer is Solitaire or Mahjong, and Pinochle on the smartphone.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-17 14:01:39 UTC
Reply
Permalink
Post by RonB
My kids have game consoles, but still want to play Windows video games. They
occasionally try to get me interested in some of them — but I guess I'm just
too old. The last video game I was fairly good at was Pong. About all I play
on my computer is Solitaire or Mahjong, and Pinochle on the smartphone.
The appeal of playing a game on the PC is the ability to play anywhere
through a laptop. If they're trying to do it on their desktop, I don't
see the benefit except for greater graphics than what a console can
handle. I actually find that fairly ridiculous since both the PS5 and
the Xbox Series X play games natively in 4K. I'm not sure why a gamer
would need more than that.

There is also the benefit of games being cheaper on the PC since you're
not stuck to one marketplace. After all, if Steam is selling your title
for $50 but you can get it at the Epic Game Store for $25, why wouldn't
you?
--
CrudeSausage
RonB
2024-12-17 20:59:59 UTC
Reply
Permalink
Post by CrudeSausage
The appeal of playing a game on the PC is the ability to play anywhere
through a laptop. If they're trying to do it on their desktop, I don't
see the benefit except for greater graphics than what a console can
handle. I actually find that fairly ridiculous since both the PS5 and
the Xbox Series X play games natively in 4K. I'm not sure why a gamer
would need more than that.
"Anywhere" for these kids is basically at home. They have an older brother
who uses a gaming laptop (not nearly into games as much as they are), but
they kind of feel "sorry" for him — like he's got an inferior experience.
Post by CrudeSausage
There is also the benefit of games being cheaper on the PC since you're
not stuck to one marketplace. After all, if Steam is selling your title
for $50 but you can get it at the Epic Game Store for $25, why wouldn't
you?
I guess I can see that point.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
rbowman
2024-12-17 20:20:17 UTC
Reply
Permalink
Post by RonB
My kids have game consoles, but still want to play Windows video games.
They occasionally try to get me interested in some of them — but I guess
I'm just too old. The last video game I was fairly good at was Pong.
About all I play on my computer is Solitaire or Mahjong, and Pinochle on
the smartphone.
I made the mistake of opening Mahjongg and got hooked. I like the Debian/
Ubuntu version better than KMahjongg on the Fedora box.

There seems to be an AARP online version with leaderboards but I haven't
played it.
RonB
2024-12-18 11:03:33 UTC
Reply
Permalink
Post by rbowman
I made the mistake of opening Mahjongg and got hooked. I like the Debian/
Ubuntu version better than KMahjongg on the Fedora box.
There seems to be an AARP online version with leaderboards but I haven't
played it.
I like the KMahjongg version (but I don't know if I've tried the Ubuntu
version). I always play the "Well" board configuration. I think I've beaten
it three or four times total — maybe five times. Not a lot.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
rbowman
2024-12-18 20:17:42 UTC
Reply
Permalink
Post by RonB
I like the KMahjongg version (but I don't know if I've tried the Ubuntu
version). I always play the "Well" board configuration. I think I've
beaten it three or four times total — maybe five times. Not a lot.
That would take some getting used to. The 'Difficult' on Ubuntu is about
the same as the 'Default' on Fedora. The perspective is different enough
that I missed plays since the tiles didn't look playable.

I'd also have to experiment with the tile selection. The hint showed
several matches where the patterns weren't the same. I beat the Ubuntu
version fairly regularly. I'm not sure if all layouts are supposed to have
a solution. There are times when you have three of the same tiles exposed;
which two do you remove?
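That choice can actually decide the game: every kind has four tiles, so
taking a different two of the three exposed ones uncovers different tiles
underneath. A toy sketch in Python shows it (stacks only; real layouts
also block sideways, and the flower/season tiles match within their group
even though the faces differ, which is probably why the hint pairs tiles
whose patterns aren't the same):

from functools import lru_cache
from itertools import combinations

def solvable(stacks):
    """True if some sequence of matching-pair removals clears the board.
    Each stack exposes only its last (top) tile."""
    @lru_cache(maxsize=None)
    def solve(state):
        if all(not s for s in state):
            return True  # board cleared
        tops = [(i, s[-1]) for i, s in enumerate(state) if s]
        for (i, a), (j, b) in combinations(tops, 2):
            if a == b:  # a playable pair of exposed tiles
                nxt = [list(s) for s in state]
                nxt[i].pop()
                nxt[j].pop()
                if solve(tuple(tuple(s) for s in nxt)):
                    return True
        return False  # every pairing dead-ends
    return solve(tuple(tuple(s) for s in stacks))

# Three A's exposed, the fourth A buried under a B:
print(solvable([['B', 'A'], ['A'], ['A'], ['A', 'B']]))  # True
# Take the two loose A's first and you strand the rest:
print(solvable([['B', 'A'], [], [], ['A', 'B']]))        # False

So a layout can be winnable and still lost by removing the wrong two of
three. Whether the dealt layouts are guaranteed solvable depends on how
the game shuffles; I haven't checked what KMahjongg or the Ubuntu one
does.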
rbowman
2024-12-19 05:33:27 UTC
Reply
Permalink
Post by rbowman
That would take some getting used to. The 'Difficult' on Ubuntu is about
the same as the 'Default' on Fedora. The perspective is different enough
that I missed plays since the tiles didn't look playable.
Follow up: I beat the Default twice. I still don't know what's up with the
tiles that look like flowers where what matches isn't the same. I have to
click on them and see if it goes.
RonB
2024-12-19 07:57:46 UTC
Reply
Permalink
Post by rbowman
Follow up: I beat the Default twice. I still don't know what's up with the
tiles that look like flowers where what matches isn't the same. I have to
click on them and see if it goes.
You've got some in pots and some not in pots. I think that's basically the
only difference.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
RonB
2024-12-19 07:47:57 UTC
Reply
Permalink
Post by rbowman
That would take some getting used to. The 'Difficult' on Ubuntu is about
the same as the 'Default' on Fedora. The perspective is different enough
that I missed plays since the tiles didn't look playable.
I'd also have to experiment with the tile selection. The hint showed
several matches where the patterns weren't the same. I beat the Ubuntu
version fairly regularly. I'm not sure if all layouts are supposed to have
a solution. There are times when you have three of the same tiles exposed;
which two do you remove?
I'm guessing you can always win (as they let you re-run the latest puzzle,
which I never do) — but you would have to be very lucky.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
rbowman
2024-12-19 20:01:39 UTC
Reply
Permalink
Post by RonB
I'm guessing you can always win (as they let you re-run the latest puzzle,
which I never do) — but you would have to be very lucky.
I did that several times by mistake by clicking the wrong button until the
dime dropped that the layout looked familiar.
Physfitfreak
2024-12-19 07:53:22 UTC
Reply
Permalink
Post by rbowman
I'd also have to experiment with the tile selection. The hint showed
several matches where the patterns weren't the same. I beat the Ubuntu
version fairly regularly. I'm not sure if all layouts are supposed to have
a solution. There are times when you have three of the same tiles exposed;
which two do you remove?
I tried that game about 10 or 12 years back. Found it quite childish.

Why I tried it? Cause some new girl in the warehouse said she spent
hours each day at home playing it, and I wanted to get an idea what she
was and where I could use her in that job.

Those people play backgammon too. Part of it _must_ be chance cause
they're Penis X grade people after all.

But play chess, and everything is in your control. You lose, you really
lose. You win, you really win :) I still use my 45-year-old Radio Shack
chessboard. My chess enthusiast friends are all gone. When in the mood, I
try yet one more time to beat its level 7. I've never been able to.
Level 6 is the highest I ever could beat it, like 10% of the time.

Level 7 is so impossible to beat that I'm thinking its scale of 1 to 9
is not even logarithmic. I wish I knew how they determined it in the
code. If it were logarithmic, then on average about one time in 100
tries I would be able to beat it at level 7. But I have tried a few
thousand times since I was in my 20s at that level, without one win.

My own play level is pretty good and I still play a couple times a week,
cause who knows :)
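For what it's worth, the arithmetic backs that up. Assuming "logarithmic"
means each level divides the win odds by ten (so roughly 10% at level 6
and 1% at level 7), a few thousand winless tries would be astronomically
unlikely:

p_win = 0.10 / 10        # ~10% at level 6, one order of magnitude per level
n_tries = 3000           # rough count of attempts over the years
p_never = (1 - p_win) ** n_tries
print(f"{p_never:.1e}")  # ~8.1e-14

So either the scale climbs much faster than a factor of ten per level,
or level 7 simply plays far above what the level numbers suggest.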
DFS
2024-12-20 14:29:37 UTC
Reply
Permalink
Post by Physfitfreak
Cause some new girl in the warehouse
The warehouse? You're that old man who walks in front of the forklift
waving little orange flags.

It's good you found a job commensurate with your mental abilities.
RonB
2024-12-16 10:28:14 UTC
Reply
Permalink
Post by CrudeSausage
It can and it already does on Snapdragon offerings.
Apparently only "sort of."
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-16 16:01:41 UTC
Reply
Permalink
Post by RonB
Apparently only "sort of."
Which is part of why I recommended that ARM enthusiasts go to Apple.
Only Apple actually follows through on their radical decisions.
Microsoft will announce something on Monday, do something half-assed on
Tuesday and abandon the project altogether on Wednesday. Their fortune
comes from the fact that people are reluctant to move away from x86-64.
If and once they do, Microsoft will have a lot of trouble catching up to
what Apple is doing.
--
CrudeSausage
RonB
2024-12-17 08:15:01 UTC
Reply
Permalink
Post by CrudeSausage
Which is part of why I recommended that ARM enthusiasts go to Apple.
Only Apple actually follows through on their radical decisions.
Microsoft will announce something on Monday, do something half-assed on
Tuesday and abandon the project altogether on Wednesday. Their fortune
comes from the fact that people are reluctant to move away from x86-64.
If and once they do, Microsoft will have a lot of trouble catching up to
what Apple is doing.
I get that. But does Apple run these high-end video games that require the
powerful (watt-gobbling) GPUs? I don't know, these video games hold no
interest for me.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-17 13:59:08 UTC
Reply
Permalink
Post by RonB
I get that. But does Apple run these high-end video games that require the
powerful (watt-gobbling) GPUs? I don't know, these video games hold no
interest for me.
There is a community of Apple users getting their Mx machines to run
today's games, in the same way Linux users try to get their OS of choice
to play them. For what it's worth, it's a lot easier in Linux than it is
in MacOS. A game developed specifically for Macs will run very well on
the hardware because it is indeed a lot more powerful than people
realize, but those titles are very few and are likely to remain so.
--
CrudeSausage
RonB
2024-12-17 20:45:46 UTC
Reply
Permalink
Post by CrudeSausage
There is a community of Apple users getting their Mx machines to run
today's games, in the same way Linux users try to get their OS of choice
to play them. For what it's worth, it's a lot easier in Linux than it is
in MacOS. A game developed specifically for Macs will run very well on
the hardware because it is indeed a lot more powerful than people
realize, but those titles are very few and are likely to remain so.
For me Macs are too limited. But I actually got Trelby (a 2012 screenwriting
application with recent updates) to work on my MacBook Air last week. Trelby
is based on Python. What took forever, though, was getting Brew and Python
installed on the old Mac (2015).
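(For anyone trying the same thing, the sequence I'd expect on a machine
that old is roughly: install Homebrew with its official one-liner from
brew.sh, then "brew install python" to get a current Python 3, then
"pip3 install trelby"; that last step assumes the current Trelby release
is actually published on PyPI, which is worth verifying. The slow part
on a 2015 Mac is Homebrew building packages from source when no
pre-built bottle exists for the old OS version.)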

As for Mac OS's "normal" mode, I just don't like it at all. When I try
to exit its terminal by typing "exit", it does exit (sort of), but the
window stays there until I close it with the trackpad. But it's still
not really closed, it's minimized (even though I chose close, not
minimize). I then have to two-finger click on the application in the
dock, navigate down and tap on "quit" to finally get the damn thing to
go away. In Linux I type "exit" — done.
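(That particular annoyance is configurable, for what it's worth: in
Terminal's settings, under Profiles > Shell, there's a "When the shell
exits" option that can be set to "Close the window", which makes a typed
"exit" behave the way Linux terminals do. Quitting the whole application
is still Cmd-Q, since the red button on a Mac only closes the window.)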

I get it that Mac is good at certain things (mostly for integrating with
other Apple crap) but I want to use an OS the way I want to use it — not be
constrained by an OS that thinks it's your nanny.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-17 23:03:52 UTC
Reply
Permalink
Post by RonB
For me Macs are too limited. But I actually got Trelby (a 2012 screenwriting
application with recent updates) to work on my MacBook Air last week. Trelby
is based on Python. What took forever, though, was getting Brew and Python
installed on the old Mac (2015).
As for Mac OS's "normal" mode, I just don't like it at all. When I try
to exit its terminal by typing "exit", it does exit (sort of), but the
window stays there until I close it with the trackpad. But it's still
not really closed, it's minimized (even though I chose close, not
minimize). I then have to two-finger click on the application in the
dock, navigate down and tap on "quit" to finally get the damn thing to
go away. In Linux I type "exit" — done.
I get it that Mac is good at certain things (mostly for integrating with
other Apple crap) but I want to use an OS the way I want to use it — not be
constrained by an OS that thinks it's your nanny.
I have to admit that I'm not a fan of how the MacOS doesn't close
applications when you click on the red dot in the corner. To be fair
though, this is a practice that other operating systems have borrowed,
because there is no real need to terminate an application and reacquire
that memory at a time when there is no shortage of memory on most
hardware. Keeping the application dormant so that it can be restored
more quickly seems to be preferred, which is why most Windows
applications and a good number of Linux ones close to the tray rather
than closing entirely.
--
CrudeSausage
RonB
2024-12-18 11:20:40 UTC
Reply
Permalink
Post by CrudeSausage
I have to admit that I'm not a fan of how the MacOS doesn't close
applications when you click on the red dot in the corner. To be fair
though, this is a practice that other operating systems have borrowed,
because there is no real need to terminate an application and reacquire
that memory at a time when there is no shortage of memory on most
hardware. Keeping the application dormant so that it can be restored
more quickly seems to be preferred, which is why most Windows
applications and a good number of Linux ones close to the tray rather
than closing entirely.
My "real need" is that, when I close an application I want it closed.
Period. If I ran into Linux desktops that worked this way, I wouldn't use
them. As for the amount of time it takes to open an application vs the time
it takes to "unminimize it," it's inconsequential (at least with the
applications I use). The only time I want to minimize applications (instead
of closing them) is when I'm still doing something in the minimized
application. That doesn't happen often. But when I do that on my Mac, I use
the minimize button.

And then it comes down to, what's the point of having a minimize button if
the quit button just minimizes. It seems like someone is confused.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-18 14:20:07 UTC
Reply
Permalink
Post by RonB
My "real need" is that, when I close an application I want it closed.
Period. If I ran into Linux desktops that worked this way, I wouldn't use
them. As for the amount of time it takes to open an application vs the time
it takes to "unminimize it," it's inconsequential (at least with the
applications I use). The only time I want to minimize applications (instead
of closing them) is when I'm still doing something in the minimized
application. That doesn't happen often. But when I do that on my Mac, I use
the minimize button.
And then it comes down to, what's the point of having a minimize button if
the quit button just minimizes. It seems like someone is confused.
I have to admit that minimize becomes useless if close just removes the
window but keeps it running in memory. I imagine that there used to be a
speed benefit to minimizing rather than closing, but it doesn't seem to
be there anymore. Either way, the interface doesn't bother me as much as
it does you.
--
CrudeSausage
-hh
2024-12-18 21:27:40 UTC
Reply
Permalink
Post by CrudeSausage
I have to admit that minimize becomes useless if close just removes the
window but keeps it running in memory. I imagine that there used to be a
speed benefit to minimizing rather than closing, but it doesn't seem to
be there anymore. Either way, the interface doesn't bother me as much as
it does you.
Allowing Apps to remain in the background was the method years ago to
'speed things up' for switching between them... probably goes back as
far as MultiFinder (System Software 5, circa 1987).

But these days, both Windows & MacOS have the design philosophy that
closing a document window doesn't necessarily quit the App, because one
is likely to be just ending one's work session with Documents Set A and
will be opening up Document Set B. As such, if one really intends to
quit the App, invoke/use Cmd-Q. Thus, this seems more of a PEBKAC.

-hh
CrudeSausage
2024-12-19 00:46:57 UTC
Reply
Permalink
Post by -hh
Allowing Apps to remain in the background was the method years ago to
'speed things up' for switching between them... probably goes back as
far as MultiFinder (System Software 5, circa 1987).
But these days, both Windows & MacOS have the design philosophy that
closing a document window doesn't necessarily quit the App, because one
is likely to be just ending one's work session with Documents Set A and
will be opening up Document Set B. As such, if one really intends to
quit the App, invoke/use Cmd-Q. Thus, this seems more of a PEBKAC.
I'm not a fan of close minimizing to the taskbar but I can live with
close minimizing to the system tray. If I had little RAM and the
operating system behaved this way, I'd be pretty pissed but we all have
way more RAM than our operating systems will even need for the kinds of
tasks we do.

I know that vallor needs 128GB and 192 cores to calculate the
probability that he'll ever see his dick again but most of us don't need
that much power.
--
CrudeSausage
pothead
2024-12-20 01:35:54 UTC
Reply
Permalink
Post by -hh
Allowing Apps to remain in the background was the method years ago to
'speed things up' for switching between them... probably goes back as
far as MultiFinder (System Software 5, circa 1987). Thus, this seems
more of a PEBKAC.
-hh
Back in the stone age this was called a TSR. Terminate and stay resident.
--
pothead

All about snit read below. Links courtesy of Ron:

Example of Snit trolling in real time:

<https://groups.google.com/g/comp.os.linux.advocacy/c/biFilzgCcVg/m/eUcNGw6lP7UJ>

All about the snit troll:

<https://web.archive.org/web/20181028000459/http://www.cosmicpenguin.com/snit.html>
<https://web.archive.org/web/20190529043314/http://cosmicpenguin.com/snitlist.html>
<https://web.archive.org/web/20190529062255/http://cosmicpenguin.com/snitLieMethods.html>
RonB
2024-12-19 08:06:51 UTC
Reply
Permalink
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by RonB
Post by CrudeSausage
Post by Lawrence D'Oliveiro
Post by CrudeSausage
Higher performance per watt which leads to lower power use and therefore
improved battery life. Whether Intel and AMD want to admit it or not,
people _do_ want to have a computer which can handle a whole day's work
on a single charge and which won't increase electrical bills.
Sure. But Windows can never give it to them.
It can and it already does on Snapdragon offerings.
Apparently only "sort of."
Which is part of why I recommended that ARM enthusiasts go to Apple.
Only Apple actually follows through on their radical decisions.
Microsoft will announce something on Monday, do something half-assed on
Tuesday and abandon the project altogether on Wednesday. Their fortune
comes from the fact that people are reluctant to move away from x86-64.
If and once they do, Microsoft will have a lot of trouble catching up to
what Apple is doing.
I get that. But can Apple's machines run these high-end video games that
require the powerful (watt-gobbling) GPUs? I don't know; these video games
hold no interest for me.
There is a community of Apple users getting their Mx machines to run
today's games in the same way Linux users try to get their OS of choice to
play them. For what it's worth, it's a lot easier in Linux than it is in
MacOS. A game developed specifically for Macs will run very well on the
hardware because it is indeed a lot more powerful than people realize,
but those titles are very few and are likely to remain so.
For me Macs are too limited. But I actually got Trelby (a 2012 screenwriting
application with recent updates) to work on my MacBook Air last week. Trelby
is based on Python. What took forever, though, was getting Brew and Python
installed on the old Mac (2015).
As for Mac OS's "normal" mode, I just don't like it at all. When I try to
exit its terminal by typing "exit", it does exit (sort of), but the window
stays there until I close it with the trackpad. But it's still not really
closed, it's minimized (even though I chose close, not minimize). I then
have to two-finger click on the application in the dock, navigate down and
tap on "quit" to finally get the damn thing to go away. In Linux I type
"exit" — done.
I get it that Mac is good at certain things (mostly for integrating with
other Apple crap) but I want to use an OS the way I want to use it — not be
constrained by an OS that thinks it's your nanny.
I have to admit that I'm not a fan of how the MacOS doesn't close
applications when you click on the red dot in the corner. To be fair
though, this is a practice that other operating systems have borrowed
because there is no real need to terminate an application and reacquire
that memory at a time when there is no shortage of memory on most
hardware. Keeping the application dormant so that it can be restored
more quickly seems to be preferred, which is why most Windows
applications and a good number of Linux ones close to the tray rather
than closing entirely.
My "real need" is that, when I close an application I want it closed.
Period. If I ran into Linux desktops that worked this way, I wouldn't use
them. As for the amount of time it takes to open an application vs the time
it takes to "unminimize it," it's inconsequential (at least with the
applications I use). The only time I want to minimize applications (instead
of closing them) is when I'm still doing something in the minimized
application. That doesn't happen often. But when I do that on my Mac, I use
the minimize button.
And then it comes down to: what's the point of having a minimize button if
the quit button just minimizes? It seems like someone is confused.
I have to admit that minimize becomes useless if close just removes the
window but keeps it running in memory. I imagine that there used to be a
speed benefit to minimizing rather than closing, but it doesn't seem to
be there anymore. Either way, the interface doesn't bother me as much as
it does you.
I guess I'm a little "OCD" (if that's the right term) about some things.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-14 12:51:48 UTC
Reply
Permalink
Post by Joel
https://www.tomshardware.com/pc-components/cpus/intels-interim-co-ceo-claims-retailers-are-concerned-by-return-rate-of-qualcomm-powered-machines
Bottom line, people are having an issue with familiarity, returning a
Windows ARM device. It didn't operate quite the same way. Duh, it's
a good thing, it's superior tech for a laptop. Don't just return it.
Unlike Apple which made sure that its compatibility layer allowed most
existing software to continue working when they made the architecture
change (from 68000 to PowerPC, from PowerPC to Intel, from Intel to
ARM), Microsoft couldn't be bothered to do the same. They must have run
out of resources.
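At least Apple's layer is inspectable: on an Apple Silicon Mac, this sysctl
reportedly tells you whether the current process is running translated
under Rosetta 2 (1 = translated, 0 = native; the key only exists on Apple
Silicon):

  sysctl -n sysctl.proc_translated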
--
CrudeSausage
RonB
2024-12-14 15:58:03 UTC
Reply
Permalink
Post by CrudeSausage
Post by Joel
https://www.tomshardware.com/pc-components/cpus/intels-interim-co-ceo-claims-retailers-are-concerned-by-return-rate-of-qualcomm-powered-machines
Bottom line, people are having an issue with familiarity, returning a
Windows ARM device. It didn't operate quite the same way. Duh, it's
a good thing, it's superior tech for a laptop. Don't just return it.
I think you (Joel) are kind of missing the point. The reason Microsoft has
retained an almost monopoly status in the desktop OS market is because it
runs the supposedly "necessary" Windows applications. I'm guessing the ARM
CPU versions of Windows computers have limitations in this regard.
Post by CrudeSausage
Unlike Apple which made sure that its compatibility layer allowed most
existing software to continue working when they made the architecture
change (from 68000 to PowerPC, from PowerPC to Intel, from Intel to
ARM), Microsoft couldn't be bothered to do the same. They must have run
out of resources.
I think you're right. Microsoft is always trying to "catch up" on new trends
without putting in the work and getting it right. They've got the "the
customer is our beta tester" attitude. Looks like another case where doing a
half-assed rollout is going to bite them in the butt.
--
“Evil is not able to create anything new, it can only distort and destroy
what has been invented or made by the forces of good.” —J.R.R. Tolkien
CrudeSausage
2024-12-14 18:01:57 UTC
Reply
Permalink
Post by RonB
Post by CrudeSausage
Post by Joel
https://www.tomshardware.com/pc-components/cpus/intels-interim-co-ceo-claims-retailers-are-concerned-by-return-rate-of-qualcomm-powered-machines
Bottom line, people are having an issue with familiarity, returning a
Windows ARM device. It didn't operate quite the same way. Duh, it's
a good thing, it's superior tech for a laptop. Don't just return it.
I think you (Joel) are kind of missing the point. The reason Microsoft has
retained an almost monopoly status in the desktop OS market is because it
runs the supposedly "necessary" Windows applications. I'm guessing the ARM
CPU versions of Windows computers have limitations in this regard.
I haven't tried it myself but I would believe that the ARM version of
most programs doesn't exist and whatever compatibility layer Microsoft
offers is grossly insufficient to make the x86-64 software work reliably
or perform well under the new architecture.
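One rough way to check what you're actually getting, assuming a
Windows-on-ARM box: Task Manager's Details view has an Architecture column,
and from a command prompt

  echo %PROCESSOR_ARCHITECTURE%

should report ARM64 in a native shell but x86 or AMD64 from inside an
emulated one.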
Post by RonB
Post by CrudeSausage
Unlike Apple which made sure that its compatibility layer allowed most
existing software to continue working when they made the architecture
change (from 68000 to PowerPC, from PowerPC to Intel, from Intel to
ARM), Microsoft couldn't be bothered to do the same. They must have run
out of resources.
I think you're right. Microsoft is always trying to "catch up" on new trends
without putting in the work and getting it right. They've got the "the
customer is our beta tester" attitude. Looks like another case where doing a
half-assed rollout is going to bite them in the butt.
I will say this much: if ARM becomes the only game in town by the time I
am ready to upgrade from this machine, I won't be bothering with
Microsoft at all. By then, it won't bother me one bit that I will be
abandoning the rather large movie collection I've created in Microsoft
Films & TV if it means avoiding the exhibition of gross incompetence
that Microsoft's venture into an ARM-only environment will be.
--
CrudeSausage