Java seems to be just asking for problems down the road. "This software needs version <x>, but this one needs version <y>." "Both of those are 20 years old..."
But I guess that is more of a problem of not keeping software updated than anything inherent to Java.
Java at least has enough modern features that improve readability and has enough in common with other modern languages that if they have to upgrade it again in the future, it will be much less of a hassle. COBOL is basically a step above assembly, so working out how large programs are structured is significantly harder and requires more COBOL specific expertise.
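For a taste of the "modern features" in question, here is a hedged sketch in recent Java (records and streams, Java 16+); the `Account` type and figures are made up for illustration:

```java
import java.util.List;

public class ModernJava {
    // A record gives an immutable data carrier in one line (Java 16+)
    record Account(String id, long balanceCents) {}

    public static void main(String[] args) {
        List<Account> accounts = List.of(
                new Account("A-1", 12_500),
                new Account("A-2", -300),
                new Account("A-3", 9_900));
        // Streams read closer to the business rule than an indexed loop
        long totalPositive = accounts.stream()
                .filter(a -> a.balanceCents() > 0)
                .mapToLong(Account::balanceCents)
                .sum();
        System.out.println(totalPositive); // 22400
    }
}
```

This is the kind of structure a future migration team can read at a glance, which is the point being made about COBOL's comparative opacity.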
Monwyn
Once you know one language you know them all. You could create a finance-centric language that's very easy to learn if you wanted. Then you could build security into the language to make it financially idiot-proof, which would help with all that hacking we've been having recently.
There are too many languages around with the use case of "programmers think it's cool" and too few practical languages.
IIRC COBOL is procedural rather than object-oriented and has some deeply, deeply weird limitations on how you can order operations compared to modern languages.
Making a purpose-built finance/commerce-focused language isn't a bad idea, but also, like, just write shit in C/C++/C#, it's never going to be deprecated
Most stuff in FinServ is built to be broadly compatible and deployable, which Java is. It's also faster than JS and Python at math etc.
That said, lots of people hate writing Java, and you can find people writing in basically everything given the massive spend (banks spend ~10% of revenue on technology) and the number of them. There's even a place with its core in OCaml.
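On the "better at math" point: speed aside, one concrete reason Java keeps showing up in finance code is exact decimal arithmetic via the standard library's `BigDecimal`. A minimal sketch:

```java
import java.math.BigDecimal;

public class MoneyDemo {
    public static void main(String[] args) {
        // Binary floating point drifts: 0.1 + 0.2 is not exactly 0.3
        System.out.println(0.1 + 0.2);    // 0.30000000000000004
        // BigDecimal keeps exact decimal arithmetic, which is why it
        // is the conventional choice for money amounts in Java
        BigDecimal a = new BigDecimal("0.10");
        BigDecimal b = new BigDecimal("0.20");
        System.out.println(a.add(b));     // 0.30
    }
}
```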
Anything that incentivises US banks to get up to standards set by the rest of the world would be good. Last time I was in the US it was like stepping back in time 30 years.
Considering how much bank tech is still rooted in the 70s (in design, if not actual machines) you kinda were.
There is a reason that COBOL programmers make $$$.
Eh. That's true plenty of places. The US banking system being so outdated feels more like a cultural/domestic market issue to me. It's like they just didn't feel like advancing beyond cash.
I have days where I really wish that when I started getting known as our AS/400 COBOL 'guru' early in my career I had stuck with it instead of pulling the ripcord.
Then there are days I'm so very glad I'm doing what I'm doing instead of spending my days battling sixty year old emulated code.
Watching what my dad went through as a COBOL guy, and how the bank treated him as soon as they had moved everything to Java: be glad you pulled the ripcord.
The EU and US have agreed to bury the hatchet for 5 years on the 20 year trade conflict over illegal EU subsidies of Airbus. Which is generally good news.
But despite commodity prices going through the roof, and supply problems across multiple industries they are keeping the Trump metals tariffs in place. Because they are popular with steelworkers unions.
Joseph Zeballos-Roig is an economics reporter for Business Insider:
NOTABLE: Sen. Tim Kaine tells me that *immigration reform* was discussed among Dems in budget meeting as a method of financing infrastructure package in reconciliation
“That could be a very legitimate way to look at trying to find a balanced package”
Kaine: “Anytime there’s been a CBO examination on immigration reform, it produces a significant increase in the GDP without really costing much money”
I suppose this would be an interesting way to have your cake and eat it too.
Apparently an outsized contributor to recent inflation is used car prices, which are spiking because of rapidly increasing demand as car rental demand recovers and the whole chip shortage is constricting supply of new cars.
Hedgethorn
Yeah, used cars accounted for about 30% of May's inflation reading, and new cars, rental cars, and airline tickets combined for another 15-20% of the total, if I recall correctly.
There are going to be weird spot shortages and things for the next few months while everything works its way through the system as more and more places reopen. But by its very nature that is a temporary thing rather than an underlying trend.
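Back-of-the-envelope, a category's share of a monthly inflation print is roughly its basket weight times its price change. A sketch with illustrative figures close to the published May 2021 numbers (the weight and changes here are approximations, not official BLS data):

```java
public class InflationShare {
    public static void main(String[] args) {
        // Approximate May 2021 figures (illustrative):
        double usedCarWeight = 0.027; // used cars ~2.7% of the CPI basket
        double usedCarRise   = 7.3;   // % month-over-month price increase
        double headlineRise  = 0.64;  // % month-over-month all-items CPI

        // Contribution in percentage points = weight * item change
        double contribution = usedCarWeight * usedCarRise;
        double share = 100 * contribution / headlineRise;
        System.out.printf("used cars: %.2f pp of %.2f pp (~%.0f%%)%n",
                contribution, headlineRise, share); // roughly 31%
    }
}
```

The result lands right around the "about 30%" figure quoted above.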
I wonder how much all that bitcoin farming (and other blockchain bollocks) is to blame for the shortage of chips.
Not much. The shortage is more closely linked to the pandemic and working conditions and the fact that we've allowed one country to dominate the manufacturing of such things.
That link cites crypto mining as the driving factor behind the shortage of video cards. It's a big problem in the GPU market that is having wider effects, as there are only so many semiconductors to go around right now.
There are only so many semiconductor manufacturing plants. The ones used for car chips and the ones AMD & Nvidia buy wafers from, such as TSMC, have only so much capacity, and it's all first come, first served. Intel mostly has their own plants for the GPU side, but as of yet no one takes them all that seriously despite them spending $BILLIONS on trying to break into it.
That bill which passed earlier this year is supposed to help subsidize the opening of new generalized plants like TSMC's stateside, but it will take years before those are running at full capacity, let alone proven in operation. In the meantime, the easiest way to get around this would be to back off of China, but I don't see that happening with the current political climate around the world.
Not only are there only so many plants, but processes are not interchangeable, and even different manufacturers within a process may not be compatible.
The plant that is making the latest GPUs is not capable of making the chips that go into cars, for example. One is a 7nm/10nm node and the other is going to be much larger, because larger is cheaper and more reliable. These nodes often use entirely different equipment and require different designs.
I worked for years in technology development yield at a major semiconductor company when they started failing to achieve acceptable yields on new-generation chips.
So, it's me, guys. I'm the reason for the shortage.
Orca
This Saturday, the nation recognizes Juneteenth, which marks the day a Major General of the Union Army arrived in Galveston, Texas to enforce the Emancipation Proclamation, and free the last enslaved Black people in Texas from bondage. The day has evolved into a celebration of emancipation more generally, and while the country acknowledges the progress that has been made, it is imperative to not lose sight of the fact that we still have much work to do to address the vestiges of slavery and historic discrimination. Indeed, policies and practices exist today that are seemingly non-discriminatory on their face but still negatively affect many families of color, especially Black families. Many of these policies and practices have long-term impacts—from education to employment to business ownership to housing—that must be addressed.
One area that is particularly important for economic well-being and wealth accumulation is housing. Families who can purchase their own home in the neighborhood of their choice at a fair price and see the value of their home grow over time do better economically in the long run. But numerous policies have systemically discriminated against Black families who wish to pursue that path. This blog focuses on one of these policies: exclusionary zoning laws, which have played a role in causing racial disparities in the housing market.
And all we had to do was create a technology never before seen in the world that's constantly pushing up against Planck's constant, at a reliable 2-year cadence, under one of the most notoriously abusive and ineffective corporate structures out there.
It couldn't be simpler!
i don't know what any of this means but i'm still mad :mad:
Someday someone is going to write the story of Intel 10nm, and it is going to be very exciting.
Planck's Constant is something that puts a lower bound on how small you can make a thing, and computer chips have been edging closer and closer to that line for the last decade or so. Intel was king for a bit there in this space, but when they tried to move from 14nm down to 10nm, they had some issues they still haven't completely resolved. Part of this is because Intel's management appeared to be garbage.
This is my “explain it like I’m 5”, so it may not be totally accurate but I believe it’s essentially correct.
Edit: also, other companies that were behind Intel in R&D have since solved the problems going to 10nm and lower while Intel was stuck.
There are reasons why Apple decided to drop Intel.
No, the Planck Length (the smallest possible quantum of length in the universe) is about 10^27 times smaller than 10nm. You start getting into thermal and quantum tunneling issues in the silicon, and photon frequency resolution limits for the masking, as chip feature sizes get smaller, but the Planck Length isn't relevant.
SiliconStew
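The size gap described above is easy to sanity-check numerically. A quick sketch (the Planck length is the CODATA value, rounded; the ~8.8e26 m universe diameter is an approximate textbook figure):

```java
public class PlanckScale {
    public static void main(String[] args) {
        double planckLength = 1.616e-35; // metres (CODATA value, rounded)
        double feature      = 10e-9;     // a "10nm" feature, in metres
        double ratio = feature / planckLength;
        System.out.printf("10 nm is about %.1e Planck lengths%n", ratio); // ~6.2e26

        // Scale everything so 1 Planck length becomes 1 metre: the feature
        // becomes ~6.2e26 m, comparable to the ~8.8e26 m diameter of the
        // observable universe.
        double universeDiameter = 8.8e26; // metres, approximate
        System.out.printf("scaled feature / universe diameter = %.2f%n",
                ratio / universeDiameter); // ~0.70
    }
}
```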
The Planck units are annoying to me from a mathematical standpoint, as they mean space and time are not continuous in a numerical sense.
It's still very relevant in this case because of the energy levels of the photons being used to do the etching of the silicon. It doesn't take a whole lot to overcome the energy needed to not simply displace the atoms within the crystal matrix but remove them entirely, which leads to lower yields and other hidden errors found later on. All of that has everything to do with Planck length and how well you measure and the resolution of the image. It's a very fine dance around physics, and at the levels it's currently being done, it's entirely a form of magic afaict.
You are correct that the higher frequency needed to support the masking resolution starts causing manufacturing issues as the size gets smaller. But Planck's Constant is only "involved" because it relates photon energy to frequency. It's not a cause of any problem itself, it's just a constant.
And Planck Length is utterly irrelevant because we are not anywhere close to dealing with things at that scale. That number I gave for the difference in size is not an exaggeration. If 1 Planck Length were scaled up to 1 meter, that 10nm chip feature would be roughly the diameter of the observable universe. That vast, vast difference means you don't need the edges of your features to be Planck Length accurate, even if it were in any way possible to do it.
Thanks there are brain bits all over my screen now
Nobody knows if the Planck length is the smallest quantum of length in the universe, or even if length is quantized at all. (According to We Have No Idea, written by a CERN physicist and the guy who makes PhD Comics.)
In any case, from a practical point of view it doesn't matter for any but the most theoretical of physicists; for everyone else we can treat space (and time) as continuous (except when doing stuff on a digital computer, but that's because the computer has finite storage and so limited resolution).
From a mathematical point of view, though, you get incredibly strange effects if things are not quantized. If things are continuous, there is a way of cutting a sphere of gold in two so that both halves have the same volume and same mass as the original. (In real life, gold is quantized into gold atoms, so you can't.)
Fluid mechanics treats space, time, and fluids as continuous. But at least the last of those isn't, so the entire field is axiomatically no more than an approximation of reality.
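The gold-sphere trick mentioned above is the Banach–Tarski paradox. A compact statement of the standard theorem (it relies on the axiom of choice):

```latex
% Banach–Tarski: a solid ball B in R^3 can be partitioned into finitely
% many (5 suffice) non-measurable pieces and reassembled, using only
% rotations and translations g_i, into two balls congruent to B.
B = A_1 \cup A_2 \cup A_3 \cup A_4 \cup A_5, \qquad
g_1 A_1 \cup g_2 A_2 \cong B, \qquad
g_3 A_3 \cup g_4 A_4 \cup g_5 A_5 \cong B
```

The pieces are non-measurable, so no volume is duplicated in any measurable sense, and physical gold, being made of atoms, escapes the paradox entirely.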
Seriously only 60% of US banking customers even use online banking
And half of the people in bank branches are business employees doing cash exchange for their registers
Source: it’s my job to shrink bank branches to a tiny kiosk with a person holding a tablet and no cash handling
Take it to your bank (or worst case grocery store since they charge a fee) and turn it into real money.
https://www.washingtonpost.com/politics/biden-eu-tariffs/2021/06/15/88fcfe92-cd4c-11eb-a7f1-52b8870bef7c_story.html
Goddamnit, that is one of the stupidest ones. It's a base input material.
https://en.wikipedia.org/wiki/2020–2021_global_chip_shortage
I know!
Making very small things is hard.
Making very small things even smaller is incredibly hard, due to physics.
Previous market leader in this space had terrible management that made it much, much harder for the smart people to make things smaller.
Well, if reverse repos are any indication, possibly fucked? But also, who knows?