It's the foil lining in the walls which causes the biggest headaches I reckon. Usually when you look at the detailed readout on the wireless status you'll find there's something like 87% signal strength...but about 12% signal quality courtesy of all the reflections. Keeping the wireless base stations at the extreme ends of the house seems to work far better than centrally though.
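For what it's worth, that strength/quality split is something you can usually pull out programmatically too. Here's a rough sketch of reading both figures on a Linux box - just an assumption about the setup, since any router admin page or OS wireless status panel will show the same pair of numbers in its own way:

```python
# Rough sketch: reading link quality vs signal level on a Linux machine.
# (An assumption about the setup - other systems expose the same two numbers
# through their own wireless status pages.) /proc/net/wireless lists, per
# wireless interface: status, link quality, signal level and noise.
with open("/proc/net/wireless") as f:
    lines = f.readlines()

for line in lines[2:]:  # first two lines are column headers
    fields = line.split()
    iface = fields[0].rstrip(":")
    quality = fields[2].rstrip(".")   # link quality, typically out of 70
    signal = fields[3].rstrip(".")    # signal level in dBm
    print(f"{iface}: link quality {quality}, signal level {signal} dBm")
```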
I would like to try setting up a proper mesh network sometime and see if that works. They use far lower-power base stations, but with a whole load of them dotted around the house, one in each room if you wanted. Sadly at a few hundred quid it's not something I'm willing to just take a punt on. Routing network cabling is far cheaper...and if I'm honest, once you've got it all loomed up and have all the cable flags on the ends at the switch...quite satisfying.
Downstairs is more of a headache though as there's nowhere really to drop cables down. The wall void stops at floor level because of how the place was constructed. My current plan to get a feed down to the lounge (mainly so we can get a wired connection to the plethora of games consoles Chris has down there) is probably to drop a cable run through the floor of the Annexe in my room down into the garage. The previous owners never capped off the holes they made in the wall in there when they fitted some additional mains sockets in the lounge...so getting at those is easy. There's a double socket buried behind the sofa we don't use, so I'm thinking of splitting that into a single mains socket and a single network point. I'm not messing about routing individual lines back upstairs for each console; I've got a few of those little five port switches floating around, so I'll just stick one of those in the cabinet and tie them all into that. By the very nature of what they are it's unlikely more than one will ever be active at any one time. That's exactly the sort of application those tiny little switches are ideal for.
Got the new graphics card installed this afternoon. I've had the machine a couple of years now but still can't quite get enough of how well thought out it is. It took longer to unplug things to put the machine on the desk than it did to swap the card.
Wider view...
Quick tour for anyone who's not seen inside a Mac Pro 1.1/2.1 before. The large rectangular box at the top left is the 5.25" drive cage. It pulls straight out towards you, no screws or anything. Only fiddly bit is threading the data cable out as it's *just* old enough that the stock optical drive is IDE rather than SATA. I've never bothered changing it as it works just fine.
Upper right is the power supply. Getting that out is a bit of a chore but they're pretty bulletproof by all accounts and rated to 980W, so more than man enough for most applications.
Below that there are four 3.5" SATA hard drive bays. These slot in end on, and are unlocked when you open the case. The drives are held onto the caddies with thumbscrews. A drive change takes less than a minute.
The huge grey rectangle at the lower left houses two hefty 120mm fans (3A apiece - and hovercraft noise levels flat out!). The lower one is ducted through the CPU heatsinks and then over the memory; the upper one blows straight into the expansion bay (the only visible void in the whole system).
CPUs live under the slightly more shiny metal cover roughly in the centre of the lowest level. The memory (all 64GB of it) is fitted to daughter boards which slot in at the lower right. Then finally you can just see a bit of the fourth 120mm fan, which is used to pull exhaust air out.
If all the fans are running hard it's like sitting next to a car with a viscous fan and a seized fan clutch. However the thermal design is such that you're rarely aware of the fans...and with them all being large they rarely have to spin quickly to shift ample air. The CPU heatsinks aren't exactly insubstantial either...
With everything back together I then had to make a run out to get some cables. The previous card had one DVI-D and two Mini DisplayPort sockets. I'd had one monitor hooked up through a DVI to HDMI adaptor and the other two via Mini DP to HDMI ones. The new card has three DisplayPort, one HDMI and one DVI-D. I was interested to discover that the DVI connected monitor was actually hooked up with a DVI to HDMI cable...then plugged into an HDMI to DVI connector at the monitor end (that one only has VGA, DVI or DisplayPort connections, no HDMI). I realised quite quickly it was because I didn't have any DVI leads long enough. The other two monitors are HDMI or VGA only...so I went out and picked up one decently long DisplayPort cable for the left monitor as it's the one which natively supports the standard, one DisplayPort to HDMI adaptor for the right display, and the middle one could have the one HDMI socket. Sorted. I also dug out a DVI to VGA adaptor from my own stash as I wanted to make use of a fourth output.
I got a nice surprise when I booted the system back up.
When you buy bits of random secondhand computer hardware from the likes of Cash Converters or CEX sometimes you lose. Sometimes however you very much win.
Here's a photo of the NVidia control panel...note the top line and spot the difference.
Yep...while it was labelled up in the shop as a 3GB card it's actually the 6GB version. This is a double win (aside from the £60 price difference!) as the 6GB card actually has a faster GPU than the 3GB version.
The numbers surrounding GPUs made my head spin a bit. This is a graphics accelerator card with a GPU onboard which can crunch maths at around four teraflops...Yes. TERAflops. That breaks my brain just a little bit. This is a card from 2016 too!
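For a sense of where a number like that comes from: peak throughput is basically shader cores × clock speed × two operations per clock (a fused multiply-add counts as two). A quick back-of-the-envelope sketch, assuming this is a GTX 1060 6GB class part - the 2016 date and the 3GB/6GB variants fit, though I won't swear to the exact clocks:

```python
# Rough sketch of where a "four teraflops" figure comes from.
# Numbers below assume a GTX 1060 6GB class GPU (an assumption, the card
# isn't named above): 1280 CUDA cores, ~1.7 GHz boost clock, and 2
# single-precision FLOPs per core per clock (one fused multiply-add).
cores = 1280
boost_clock_hz = 1.708e9
flops_per_core_per_clock = 2

peak_fp32 = cores * boost_clock_hz * flops_per_core_per_clock
print(f"Peak FP32 throughput: {peak_fp32 / 1e12:.2f} TFLOPS")  # roughly 4.4
```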
I'm not an avid gamer - the main reason I was in the market was that I'd been starting to have issues with the old ATI card - but I generally like to future proof things if I can. Realistically this is probably the last upgrade this machine will see - with the possible exception of an SSD at some point if I stumble across one cheap enough - as at the end of the day it's a machine from 2006, albeit a very expensive one when new.
Having said that though, the difference in the couple of games I tested (the only one I do play with any regularity is Minecraft in creative mode as it's like having an infinite bucket of Lego) is absolutely staggering. I was not expecting to see 50+ FPS out of a 14 year old computer in 2010-15 era games with all the options set to "make it pretty".
What I am really curious to see though is how big a jump I see on the distributed computing work. I know that at least one of the projects I'm doing work for has the facility to utilise CUDA equipped GPUs for computing...so I'm curious to see how that pans out.
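Before pointing any of the distributed computing clients at it, a quick sanity check that the card is actually visible for compute work doesn't hurt. A rough sketch, assuming the NVIDIA driver's nvidia-smi tool is installed (standard on Linux and Windows with the proprietary driver; a macOS setup may not have it):

```python
# Rough check of what a CUDA-capable compute project would see. Assumes the
# NVIDIA driver's nvidia-smi tool is on the PATH (true on Linux/Windows with
# the proprietary driver; a macOS install may not have it).
import subprocess

try:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,memory.total,driver_version",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print("GPU(s) visible to the driver:")
    print(result.stdout.strip())
except (FileNotFoundError, subprocess.CalledProcessError):
    print("No NVIDIA GPU visible (or nvidia-smi not installed)")
```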
My last task for the day though was implementing monitor number four. Did I *need* a fourth one? Absolutely not. I reckon so long as they're a good size, three is probably a good number. The right hand two of mine are 24", the left is 23" if I remember rightly. Big enough you can comfortably tile two windows on each in a portrait orientation anyway. The main thing I'd have liked a fourth one for would be to just have a system resource monitor left open. I don't need it, but my sense of order likes to know what's going on. Plus it looks cool...
I'd be lying through my teeth if I didn't admit to being tempted to add a fifth on the other side of the clock to balance things out... though I lack a matching monitor.
That little very, very yellowed Iiyama one was part of my workstation when I first hit the working world back in 2005. It was rescued (with permission) when we moved buildings and it was destined for the skip. It's horribly yellowed and has image retention issues. However it will do just fine for this application... I'm torn between trying to Retrobrite it or just painting it.