All The Significant Inventions/Discoveries Were Long Ago

Discussion in 'Science' started by impermanence, Jul 7, 2022.

  1. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,553
    Likes Received:
    2,454
    Trophy Points:
    113
    Gender:
    Male
    It made no sense to begin with. You think that a PET connected to a Pi is... the same as an SGI in the 1990s and early 2000s?

    Do you even know what you are trying to say? Because so far it has not made a lick of sense.
     
  2. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,926
    Likes Received:
    16,454
    Trophy Points:
    113
    I think you proved my point.

    The hardware was chosen because their software requirements set the performance criteria.
     
  3. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,429
    Likes Received:
    2,590
    Trophy Points:
    113
    Gender:
    Male
    They're both good but Dealers of Lightning is better.
     
  4. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Yeah. I get that. Maybe try fewer assumptions about my age or what you think I have assumed, and just read. I mean, I got a back-handed concession from you about hardware, but you clearly don't understand the point.

    This discussion is about scientific discovery. Specifically, the topic suggests that there has been a lack of it. The software didn't discover the hardware necessary to support it. It can only economize the use of what's available.

    Autodesk didn't produce Revit to drive the development of server architecture. It was server architecture that allowed them to produce a collaborative product. Adobe didn't make the Creative Cloud suite and then some research team decided they had better get working on fiber data transmission, 8 colorspace peripheral devices, and 8K monitors.

    And yes, the $100,000 fully tricked-out Octane is a 60 lb boat anchor compared to the technology in my 3 lb $700 Lenovo. It wasn't Netscape Navigator that made that happen.
     
  5. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    On it
     
  6. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Blizzard made Warcraft, and some egghead scientist thought, you know what this needs? It needs a laser that can etch a 5 nm line in a mask, a new way to dope n- and p-type channels, and a new way to coat oxides on silicon, so that we can produce surface-deposited MOSFETs in a fraction of the space of TTL and I can finally join that cyclops raid.

    No, that didn't happen.
     
    Last edited: Jul 30, 2022
  7. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,553
    Likes Received:
    2,454
    Trophy Points:
    113
    Gender:
    Male
    For the first 30 or so years of modern computers, the same company made both the hardware and the software. IBM, Univac, and all the others built the two together. However, that started to change by the mid-1970s, when you had the computer made first but really no operating system at all. The MITS Altair 8800 is a good example of this: there was no "operating system", and instructions were input in machine language. However, a bunch of college students realized the shortcoming of this and created a BASIC compiler that also worked as an operating system. And it was so popular that for almost a decade the best-selling BASIC and OS were made by that company.

    Apple, IBM, Atari, Texas Instruments, Commodore, the TRS-80: almost all of the 8-bit era used Microsoft BASIC, in a version customized for the machine and included (generally in hardware) on release.

    However, by the mid-1980s that started to change. The OS stopped being put into hardware and was now software that could be loaded separately (like the original MS BASIC for the Altair). And as the OS at that point was more of a shell operating in text mode, it did not care whether the CPU was 8, 16, or 32 bit. That did not really start to matter until we moved to graphical operating systems.

    But the jump to 64 bit was very different. The hardware was out before the OS, mostly because the OS for home users was delayed by two years. For servers, though, the CPU and the NOS came out at almost the same time, so the hardware could be used with one of the server NOS releases or with one of the many variants of UNIX. But once again, UNIX is not a user-level OS and never will be. For 64 bit to finally move to the masses, they needed a user-level OS that could work with it.

    We do already have 128-bit hardware, but it is highly specialized. Many graphics chips are already 128 bit, but that sits behind the CPU and is only internal; all data is sent to and from the CPU as 64-bit code. Not unlike the 8088, which had an 8-bit external data bus but was a 16-bit processor internally. There is not a thing stopping AMD or Intel from releasing a 128-bit CPU, but why? There is no OS for it yet, and no software. And with the desktop market shrinking more every year, it will likely be a decade or more before either of them or Microsoft makes the effort to do it.

    And to drive the point home even more, the 80386 came out in 1985, but it was not a major player in the computer industry for many years. Even though it was much faster than the 8086 through the 80286, there was simply no reason for most people to buy one. Few programs required a 32-bit processor, so it was largely used only in businesses or by the "bleeding edge" crowd (so the creation of a new world in Civilization took only 1 minute instead of 4). That changed in 1990, when Windows 3 was released and could actually take advantage of the advanced features of the 80386. Ironically, Windows 3 would still run on an XT-class system. It was Win95 that put the final nail in the coffin of the XT-286 systems, 14 years after the first PC and 13 years after the 286, because for the first time since the 80386 came out there was an OS that absolutely needed a 32-bit CPU.

    And I don't even want to touch on all of the XT-286 to 386/486 upgrade solutions that were out in that era.

    It was only after Windows 3 (and then 3.1 in 1992) that demand for the 386 finally started to take off, because for the first time you had an OS that could take advantage of all the features the CPU had to offer. This is why, when I worked at Hughes Aerospace during their first corporate computer rollout, we were pulling a hell of a lot of XT and 286 systems out. They were all that was needed for the job, so they had kept using them. And even though the Pentium had been out for years, most of what we installed were 486s, because that was all they needed. At the corporate level they had just moved to NT 3.51, but a lot of the software was still DOS-based.
     
  8. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,429
    Likes Received:
    2,590
    Trophy Points:
    113
    Gender:
    Male
    My first computer was an Atari 800XL, but all I used it for was playing games. My second was a 386SX, and it was the first time I ever really got into doing stuff on a PC and becoming somewhat familiar with the hardware and OS. There wasn't much to it at that time. There was the BIOS, autoexec.bat, config.sys and himem.sys, win.ini, and individual apps' .ini files. Super simple stuff. None of the stuff I've ever done on a PC has really challenged it in terms of processing capability in a way that was noticeable to me, with just a couple of exceptions. For my M.Eng. I wrote a program to control an HP GC in Borland Turbo Vision. The compile time for this app on my 1990-vintage 386SX reached about 45 minutes, and one of my friends talked me into upgrading to a Pentium. I did not quite get the deal on the Pentium that I had gotten on the 386SX. The 386SX deal that I got was, I suspect, the deal that put DAK out of business.

    Here is a link that seems to have all the details of the hardware I acquired when I bought the BSR 386SX from DAK in August of 1990, https://ancientelectronics.wordpress.com/tag/bsr-computer/

    And, amazingly, here is a link to the software bundle that was very similar to the 1990 deal I got, https://techmonitor.ai/technology/dak_industries_improves_super_software_bundle

    Anyway, when I upgraded to the Pentium my compile time on my GC app dropped to under five minutes.

    I just this past month upgraded to a new PC that is the most I've ever spent on one, about $1,500.

    I've got a copy of Visual Studio Community installed, and it compiles the app I'm currently working on, which hacks apart Honeywell DCS XML configuration files, in seconds. The program itself churns through about 3,000 XML files in seconds. If I can finish it, I'll be able to configure Honeywell Experion C300 CMs using MS Office, just as I've been doing for TDC configuration for almost 30 years.
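
    Not the real app, obviously, but the general shape of that kind of churn is simple enough. Here is a minimal sketch in Python rather than the Visual Studio project described above; the folder, file, element, and attribute names are made up for illustration, not what an actual Experion/TDC export looks like:

        # Minimal sketch: walk a folder of exported XML files and dump an index to CSV.
        # All paths and tag names below are hypothetical placeholders.
        import csv
        import xml.etree.ElementTree as ET
        from pathlib import Path

        SRC = Path(r"C:\exports\cm_xml")        # hypothetical folder of exported CM XML files
        OUT = Path(r"C:\exports\cm_index.csv")  # summary that MS Office can open directly

        rows = []
        for xml_file in SRC.glob("*.xml"):
            root = ET.parse(xml_file).getroot()
            for block in root.iter("Block"):    # "Block"/"Name"/"Type" are stand-in names
                rows.append([xml_file.name, block.get("Name", ""), block.get("Type", "")])

        with OUT.open("w", newline="") as f:
            csv.writer(f).writerows([["file", "block", "type"], *rows])

        print(f"Indexed {len(rows)} blocks from {SRC}")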

    It's an i7 with the OS installed on Disk 1, an SSD.

     
  9. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Additionally, I wouldn't say the lines for the latest iPhone release are so ridiculous because the App Store is so awesome. It's because they want the latest camera that can live-stream their socials over 5G.
     
    Last edited: Jul 30, 2022
  10. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    I'm not super familiar with Honeywell. I've played with Siemens, Rockwell (Allen-Bradley), Fanuc, ABB, and Festo.
     
  11. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,553
    Likes Received:
    2,454
    Trophy Points:
    113
    Gender:
    Male
    SGI was the server of choice for the WWW for almost a decade. It was only when the blade server came into dominance that they were finally pushed out of that role. But in the early days, a single SGI could do the work of over a dozen standard servers.

    And it was also the computer of choice for graphics for the same reason (especially video compositing and editing). But by the 2010s the PC had finally passed the SGI even for video editing, so the days of the company were numbered. Pixar was the last major user of SGI systems, but that ended in 2012. Cars 2 was the last Disney-Pixar movie made with SGI computers; from Brave on, it was all Intel systems.

    The current version of the Pixar software was designed to work with two 16-core Intel processors. SGI was still around, but Pixar did not even bother and moved to the x86-64 platform. Still, from 1995 until 2012 is a hell of a long run in the computer industry.

    But tell me, how long would it take your $700 Chinese-made computer to render the kind of video Disney was doing?
     
  12. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Well, since Toy Story's individual frames could take up to 30 hours to render, I'm pretty sure I've got it beat. Of course, I have the benefit of ray tracing, and they didn't at the time. I also have a solid-state drive and DDR4. Should I fire up Blender and give it a go? Or would you concede current tech has SCSI arrays and SDRAM beat?

    I tell you what, I'll fire up a VM and tie two cores behind my back.
     
    Last edited: Jul 30, 2022
  13. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,553
    Likes Received:
    2,454
    Trophy Points:
    113
    Gender:
    Male
    Well, my "First Computer" was actually an IBM System/360 mainframe. I was using those and PDP systems long before I could afford my own computer.

    The first I owned was a C-64. But that was mostly for games for me and the kids, as I was working on IBM desktops at work all the damned time.

    The first PC I remember was a Franklin PC-8000 in 1989. I traded that in 1991 for a clone 286, did a bunch of work at a computer store for a 386, and got my 486DX-50 in 1992. I've been through so damned many systems now I could not even hope to count them all, all built myself, not including laptops.

    But I well remember DAK Industries. I had a Bone Fone back in the day, and most in the industry consider them, with their incredible package deals, the largest reason for the explosion of the CD-ROM. I knew where their "Factory Store" was in Canoga Park and often visited. I even made a bundle picking up 10 of their BSR 6800MX CD-ROM units for, I want to say, $200, and selling them for $300 in the Bay Area (in 1992, $300 for a CD-ROM was a hell of a price).

     
    Grey Matter likes this.
  14. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    12,553
    Likes Received:
    2,454
    Trophy Points:
    113
    Gender:
    Male
    Oh really? Are you aware that the data needed to make a single frame was over 300 megabytes? That comes out to around 7.2 GB of data per second of footage (like all early Pixar movies, it was composited at 24 frames per second), for a total of around 34 terabytes of data rendered.

    Let's see how long your computer would take to do that. You may indeed have it beat, but by nowhere near as much as you think. Why do you think their current systems use two 16-core Xeon processors? They are not just taking some video shot at 1080p and running it through Adobe Premiere or VSDC. Even today, with that hardware, it still takes them around 24 hours to render a single frame. They are rather tight-lipped about the current amount of data per frame, but it is significantly higher than it was 27 years ago.

    And remember, that was in 1994-1995, when the best-of-the-line home computer was the Pentium-90, PCI was the top-of-the-line bus, and 4-8 MB of 70-pin SIMMs was largely your choice of RAM. How long do you think that would have taken on a system like that?
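
    For what it's worth, the arithmetic above roughly checks out. A quick back-of-the-envelope sketch in Python; the ~81-minute running time is an assumption for illustration, not a figure from this thread:

        # Rough check of the per-second and whole-film data figures quoted above.
        MB_PER_FRAME = 300      # stated data per frame
        FPS = 24                # frames per second
        RUNTIME_MIN = 81        # assumed feature length in minutes

        frames = RUNTIME_MIN * 60 * FPS
        gb_per_second = MB_PER_FRAME * FPS / 1000        # ~7.2 GB per second of footage
        total_tb = frames * MB_PER_FRAME / 1_000_000     # ~35 TB for the whole film

        print(f"{frames:,} frames, {gb_per_second:.1f} GB/s of footage, ~{total_tb:.0f} TB total")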
     
  15. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    So you're saying SGI locked up shop and went the way of the dodo because their system was too good?

    You said yourself it couldn't keep up. Did technology get worse after they closed up shop? Did Disney only have one SGI working the rendering?

    Let's be real here. It was good then. It's no good now. Scientific discovery is why it's no good. Not because After Effects takes better advantage of technology that already existed in 1995...
     
  16. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,429
    Likes Received:
    2,590
    Trophy Points:
    113
    Gender:
    Male
    I've had the opportunity to become familiar with several industrial control platforms over the years at various levels of detail.

    Once upon a time I was developing serious expertise with Allen-Bradley PLCs, PLC-5s and SLCs mostly, programming and hardware. Some of the software contortions required just to set up an analog input card on these PLCs were ungodly difficult. Compared to a Honeywell HPM configuration, dealing with AB's block transfer bullshit was just daft. Fortunately, at the time I wasn't aware of how unnecessarily difficult it was to program AB PLCs, or even Honeywell's stuff. Now, being old and frosty about this stuff, I can honestly say it all sucks and I love to hate it and battle it and make it work. I feel like maybe there is an Emperor smiling down upon me, pleased that my anger gives me power over this motley assortment of industrial control stuff.

    I've had the opportunity to use Emerson DeltaV in upgrading a cogen plant.
    This platform is easily, hands down, my favorite software suite for controls.
    Although I've reviewed some configurations using it that misused its abilities to the point that I couldn't figure out wtf the CMs were doing, due to some abstraction layering that I fortunately did not have to untangle.

    Programmed a deluge system once on the Modicon network paradigm; that was super fun.

    Got to code an app on an HP VEE system once; that was pretty fun too.

    That was actually super cool. I worked with a dude with a master's in material science who was working on a corporate-backed experiment to determine the failure modes of wrinkle-bent gas pipelines. Apparently there are, or were, lots of gas pipelines constructed where at every turn they would more or less manhandle the pipe by "wrinkle-bending" it to the required curvature, which I think may have been limited to 90° max. It was probably done more judiciously than I'm making it sound: likely a known number of inner wrinkles was required per degree of bend, and they must have had a stout set of braces to achieve this. The deformation to the pipeline microstructure was known from analysis of actual samples as well as from the math of the known stress characteristics of the material. However, there was a question on the table about how many pipeline surges would typically have to occur to cause a wrinkle bend to fail.

    Now began the fun. This guy developed a water-hammer test station and was going to code the app for it in HP VEE. That is not industrial control software; it's lab software that ran on a laptop and controlled I/O through a connected lab-grade I/O board. I think it connected through a serial port, but I don't remember for sure. Anyway, his position with the company had been cut and they had given him an extension to finish this test. He had planned on doing the code himself, but I ended up helping him by taking it over. I emailed him the final version of the app, and he pounded the test sample until failure, which happened all but instantaneously. He was testing with a water hammer, so the correlation to compressible gas surges was likely just about impossible, but he had fun with it and so did I.

    Cool stuff....
     
    Fangbeer likes this.
  17. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    A little deep dive...

    The Oracle of Google says Pixar used 117 SPARCstation 20s for rendering. The SGI, along with a Sun array, was used as a file server. They developed their own rendering software. Do you think they wondered if someone would be kind enough to develop hardware for the software they wrote for themselves?
     
  18. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Just curious, where was the failure point? Was there spalling from cavitation around the wrinkle?
     
  19. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,926
    Likes Received:
    16,454
    Trophy Points:
    113
    That was an interpreter, not a compiler. And, while it did give access to the hardware features of the machine, such as memory, disk, etc., it was not like an operating system.
    No, software was not independent of bit count at that time.
    Selling a 64-bit system to end users without the massive amount of third-party software they owned, such as the wide range of drivers for printers, scanners, tape, external hard drives, and other devices, was not considered a possible direction.

    But, that's close enough for a board such as this.
    This may be the view for some.
     
    Last edited: Jul 30, 2022
  20. WillReadmore

    WillReadmore Well-Known Member

    Joined:
    Nov 21, 2013
    Messages:
    59,926
    Likes Received:
    16,454
    Trophy Points:
    113
    I don't know what Pixar did, but before that, productions bought whatever hardware/software was required for the animations they wanted. After the film was done, there wasn't a home for that hardware. That is, it wasn't viewed as a long-term corporate investment by the production company.
     
  21. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,429
    Likes Received:
    2,590
    Trophy Points:
    113
    Gender:
    Male
    I can't quite remember, but I think the Pentium box I bought in '95 probably had a CD-ROM drive. This box got a free replacement CPU due to the FDIV debacle.

    I think my next box after this one was a discount deal on a Toshiba that came with NT for the OS and a SCSI bus.
    I ended up buying a SCSI CD-R/W drive for it and ripped a bunch of music CDs to PCM and was able to burn playable CDs with it.
    Oh man, that was such a thrill being able to mix and burn a CD in way less than real time.

    I had goofed around with being a mobile DJ from around '90 to '95, and one of the guys I worked for had bought a Philips CD copier for around $5k or $6k back around '92 or '93.
    In '99 I was able to copy CDs for the price of that SCSI drive, and I was super psyched about being able to do it.

    Interestingly, to this day I seem to think that that SCSI bus did a far better job of ripping the SCSI CD data to the SCSI HD than my next box's SATA drives were able to do.
    There may have been a box in between, but I've only built a box from components once, and it turned out great, surprisingly... on XP.
    Still my favorite version of the Windows OS, easily.
     
  22. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,429
    Likes Received:
    2,590
    Trophy Points:
    113
    Gender:
    Male
    Spalling on the interior of the pipe due to cavitation induced by the water hammer testing!?

    He may have accounted for that, but honestly it is only now, as I write this post, that I realize how flawed his water hammer test was for determining the frequency of failures due to compressible gas pressure waves. I doubt he even looked for it or cared, since Columbia was giving him the boot anyway once it was finished. He was kind of rolling along with being a bit of a redneck about getting this done for the company, and I think his biggest motivation was just to cycle this crazy system to destruction. I don't really know a better description than that he was rednecking it with the corporate overlords.

    I seriously suspect he submitted the test plan as a Hail Mary type of joke, since it likely came about around the same time that downsizing convos were circulating.

    The sample was a super mild bend, maybe 20°.
    Big diameter though, I think it was at least a 2' pipe, might have been 3.

    He had installed something like 20 or more moisture / water pad probes on the exterior, at the outer and inner edges of the bend, and just one of these signals shut down the test.
    Super easy programming, no voting required. And he said it literally shut down in seconds. I think the nature of this test was grossly inapplicable, and I doubt very much that spalling due to cavitation had time to occur.
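
    That kind of 1-out-of-N trip is about as simple as shutdown logic gets. A toy sketch of the idea in Python, with made-up probe names and stand-in I/O calls rather than anything HP VEE actually provided:

        # 1-out-of-N shutdown: any single wet pad probe stops the test, no voting.
        def read_probes():
            # Stand-in for polling the lab I/O board; returns {probe_name: wet?}.
            return {f"pad_{i}": False for i in range(20)}

        def scan_once(stop_test):
            wet = [name for name, tripped in read_probes().items() if tripped]
            if wet:                      # one wet pad is enough
                stop_test(f"moisture detected at {wet[0]}")
                return True
            return False

        if __name__ == "__main__":
            scan_once(lambda reason: print("SHUTDOWN:", reason))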
     
  23. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Maybe I'm confused about what a water hammer is? I thought that happened when flow was interrupted rapidly.

    When it happens in a pump, it tends to rip chunks out of the walls.
     
    Last edited: Jul 30, 2022
  24. Grey Matter

    Grey Matter Well-Known Member Donor

    Joined:
    Feb 15, 2020
    Messages:
    4,429
    Likes Received:
    2,590
    Trophy Points:
    113
    Gender:
    Male
    Right, I could be the one not being specific enough with the terminology. I think water hammer is generally used to describe what happens when water flow is shut off abruptly: the volume of flow coupled with the pressure of flow induces undesirable pressure waves traveling backwards from the point where the flow stops. Pump cavitation is very similar but also very different, if that makes any sense at all, which it certainly doesn't, and maybe that's because I've had a couple too many beers.
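
    For anyone who wants a number to hang on that, the usual first estimate for the surge from an abrupt closure is the Joukowsky relation, dP ≈ rho * a * dv. A quick illustrative calculation; the wave speed and velocity change below are assumed round numbers, not values from the test:

        # Joukowsky estimate of a water-hammer surge: dP ≈ rho * a * dv.
        rho = 1000.0   # water density, kg/m^3
        a = 1200.0     # typical pressure-wave speed for water in steel pipe, m/s (assumed)
        dv = 3.0       # velocity change when the flow is stopped, m/s (assumed)

        dP_pa = rho * a * dv
        print(f"surge ~ {dP_pa/1e6:.1f} MPa (~{dP_pa*0.000145:.0f} psi)")   # ~3.6 MPa, ~520 psi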

     
  25. Fangbeer

    Fangbeer Well-Known Member Past Donor

    Joined:
    Apr 13, 2011
    Messages:
    10,697
    Likes Received:
    3,729
    Trophy Points:
    113
    Don't sweat it. Fluid dynamics is spooky and I don't really understand it.

    My thought was that the rough surface of the wrinkle would create a low-pressure area, especially around the inside of a turn, which would already produce a pressure drop (like a resistor in a circuit). If the hammer occurs after the bend, I thought maybe a low-pressure wave preceding the high-pressure wave might have contributed to the failure. But admittedly I have no idea how that actually works.
     
    Grey Matter likes this.
