No, Christine, your daughter didn’t see angels

Gotta love what comes out of the mouths of children at times. Some of it is comedic, some of it is sickening. And some of it is just deserving of a facepalm. A lot of it we don’t question.

So what should be the response when a child says she saw angels in her room?

Mommy! Guess what happened last night? I saw three ANGELS!!! I woke up last night and there they were…watching me! One was tall and skinny and she was up on the curtains, another was on the other side of the curtains and she was smiling like the skinny angel. Then there was a little one that was kind of chubby sitting with her legs criss-cross applesauce on my stuffed animal bin! She was so cute, mom! She had a little look like she was going to giggle. They were all white but you could see through them. They were all looking at me and kept smiling, and I smiled back. Then, when I woke up this morning, the angels were gone.

This is from a piece by Christine Carter over on the Huffington Post called “For those who don’t believe”. She initially shared it on her own blog a pinch over three years ago.

Children can have wild imaginations at times. Their minds are very malleable, to say the least, to the point where children can be coached to say things as truth that never actually happened. The reasons behind it are many, but in Carter’s case, the reasons are quite clear:

My daughter has been told countless times, how her Father in Heaven loves and cherishes His beloved children. Much of her life has been immersed in learning of His Grace and His Glory. I have shared precious details of His Hand guiding every step of her glorious and miraculous journey. She has grown to believe.

She has grown to believe. And the statement following this paragraph is quite simply, “And how she has seen his Angels”.

No, Christine, your daughter did not see angels. Children’s imaginations run wild at times, and they will eventually learn to tell the difference between what is real and what their minds manufactured. You assert toward the end of your article that “There will come a time, when she will remember that night long ago, when she saw three angels smiling at her.” And what will she say? You hope she will look back upon it as a reminder of her faith. Or she may look back upon it and think, “I was a child then and I had a wild imagination.”

Depending on how old she was when this happened, she may not remember it at all. And as memories are imperfect, wildly so at times, having her write down her experience was only the first of many steps away from remembering what she saw accurately. For example, I know you didn’t reproduce what your daughter said with complete accuracy either. You were in the ballpark, but unless you have an audio or video recording, those aren’t her words, only what you thought were her words.

One other thing to bear in mind: the angels your daughter describes aren’t the angels described in the Bible. What your daughter imagined, and what many imagine angels to be, is an anthropomorphized version conjured during the Middle Ages and reinforced in mainstream productions including, but not limited to, Touched by an Angel. Personally I prefer the representation of angels in the Diablo series — cloaked warrior figures with wings of light rather than the feathered wings you’d see on a bird. If angels are real, that’s how I’d like to imagine they look. After all, if that’s actually how angels appear, God’s one hell of a bad-ass.

Instead, it sounds like what your daughter described came from a cartoon — in fact, given the description, they sound familiar, but I can’t quite place them right now.

Here’s a question that I don’t believe you’ve answered: what has your pastor or congregation said about this? Sightings of angels have turned entire cities into pilgrimage destinations for those seeking miraculous transformations in their lives. Has the Vatican or any official of any church visited to talk to you and your daughter and see her bedroom, the place where the sighting supposedly took place? I’m guessing not. Which tells me you don’t sincerely believe your daughter saw angels. You just want to believe she did.

Your daughter “seeing” angels is not the same as angels actually visiting her. A child’s imagination can be quite active at times, manufacturing things the child initially believes to be real. This has been demonstrated numerous times and in numerous ways. I recall a 20/20 special from some years back in which a group of children were told there was a fox in a box when there wasn’t. The children believed it so thoroughly that some were afraid of the box, and when one child finally opened it, he claimed to have actually seen the fox.

I don’t believe for a moment your daughter saw angels. She thinks she did. You think she did. But that doesn’t mean she actually did.

Beta Orionis – Part XVII: The AX860

Build Log:

I am now thoroughly pissed off at the AX860. I have no fucking clue as to what is going on with this power supply, but when an 800W 80+ rated (not even bronze-rated) power supply is more stable at operating my system than an 860W Platinum-rated power supply, there’s a major problem. And the sad thing: this is the second AX860 unit I’m now replacing.

The 800W power supply to which I’m referring is the Corsair GS800, also known as the “heart lung machine”. And yes, again, the unit I’m now replacing is a replacement itself of my original AX860.

So here’s the setup: an FX-8350 overclocked to 4.4GHz (voltage untouched), two PNY GTX 770s in SLI, a pump, 6x120mm and 3x140mm fans, and two 12″ CCFLs. Every power estimate I’ve run says 750W should be the bare minimum for what I have, with the hardware’s draw coming in at about 650W to 675W. The AX860 is an 860W power supply, so it shouldn’t have any problem handling my system. Back in November I wrote about the first unit, which for some reason just started acting up and caused me to replace both the power supply and my SATA RAID card. A retail-packaged replacement came back on RMA and initially gave me no problems.
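For the curious, here’s roughly what those wattage evaluations boil down to: a minimal C sketch of the power budget. The wattages are my own rough assumptions based on typical published TDP figures, not measured draws.

```c
#include <stdio.h>

int main(void)
{
    /* Assumed figures, not measurements: published TDPs and
     * ballpark draws for the hardware described above. */
    struct { const char *part; int watts; } budget[] = {
        { "FX-8350 @ 4.4GHz (stock voltage)",    125 },
        { "2x GTX 770 in SLI (~230W each)",      460 },
        { "Pump",                                 18 },
        { "6x 120mm + 3x 140mm fans (~3W each)",  27 },
        { "2x 12\" CCFL",                         10 },
        { "Mainboard, RAM, drives, misc",         40 },
    };
    int total = 0;
    for (int i = 0; i < (int)(sizeof budget / sizeof budget[0]); i++) {
        printf("%-38s %4dW\n", budget[i].part, budget[i].watts);
        total += budget[i].watts;
    }
    printf("%-38s %4dW\n", "Estimated total", total);
    printf("Headroom on an 860W unit: %dW\n", 860 - total);
    return 0;
}
```

That lands right around the 650W to 675W figure, which is the point: an 860W unit that can’t hold this load steady has a problem with the unit, not the load.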

A couple months later, it started acting up with symptoms similar to the first unit, and at random: under load the graphics would freeze, the sound would go haywire, and the system would completely lock up a few seconds later — reset button to the rescue! The first thing I did was change out the PCI-Express cables: initially I had the graphics cards powered through the dongled cables, so I gave each power connector on the graphics cards a direct line to the power supply. The problem remained sporadic. Then I realized one of the power cables wasn’t fully seated, but fixing that didn’t completely correct the problem either.

Then this past weekend, the system could not remain stable at all. I disconnected the AX860 and hooked up the GS800, and the system has been running completely stable ever since.

Now this is on top of the mainboard also already being replaced. Originally it was the Gigabyte 990FXA-UD3, and now it’s the ASRock 990FX Extreme6. So that makes the power supply the common denominator in the system’s instability. Pretty fucking sad, too, that a platinum-rated 860W power supply couldn’t keep the system stable.

Why not just hook up the GS800 and call it a day since it seems to have no problem remaining stable? I want something with a better efficiency rating and quieter fan for my primary system.

I had two potential replacements in mind: the Fractal Design Newton R3 1000W, also a platinum-rated power supply, or the gold-rated EVGA Supernova GS 1050W. The latter had the lower price tag and a stellar 9.7/10 rating on JonnyGuru (the Newton R3 received a 9.4/10), so I ordered one from Newegg. Until I learned of the Supernova GS, I was going to buy the Newton R3. Length is the deciding constraint: with the radiator on the floor of the case, 180mm is the longest power supply I can fit, and the shorter the better. The AX860 is 160mm long, the Newton R3 is 165mm, and the Supernova GS is 170mm.

The only way I’d be able to go shorter still would be the 160mm, bronze-rated Corsair CX850M, or even the CX750M, though that’d be cutting it close on the power ceiling. But given that neither PSU has enough PCI-Express connectors to give each graphics card two separate connectors, neither would be a good option.

Either way, the new power supply will mean redoing the entirety of the cable management in my system. I think I’m going to save that for the weekend after it arrives.

Now what to do with the AX860… If anyone from Corsair comes across this, I wouldn’t mind exchanging it for a memory kit or an SSD.

Reply to Dave Ramsey: “4 practical ways to save on pet care”

A couple days ago, Dave Ramsey’s blog (not sure if it’s Ramsey himself) published an article called “Pricey Pets: 4 Practical Ways to Save on Pet Care”. I tried leaving a comment, but it’s still in limbo. Note to blog owners: don’t have a comment section if you’re going to leave comments in moderation limbo.

Anyway, let’s get into this.

1. Pet Food

Just buy a bulk bag of dry dog food and pour it into a bowl. Your dog or cat doesn’t need a fancy feast. They just need food.

While it’s not necessary to go with the most premium pet food available, there is still “junk food” for pets, and going a little premium can pay dividends with the long-term health of your pet. This is especially true when it comes to your pet’s teeth — some of the more premium foods are better at tartar prevention, which can save you from having dental work later or having to use a specialty (read: expensive) food for their teeth and gums. You don’t need to go all-out, but don’t get the cheapest food on the shelf either.

For example, I buy Purina One SmartBlend Indoor Advantage for my 9 1/2 year-old feline.

2. Supplies & Medicine

Dogs don’t require parkas in the winter and sunhats in the summer. God has equipped them with everything they need to enjoy the Great Outdoors au natural—unless of course we’re talking hairless cats.

Sorry, but no. And a cursory glance at the variety of breeds shows how wrong this statement is. For one, winters can be harsh, and if you have a breed that originated in a predominantly warmer climate, that can be a problem. One of my parents’ dogs, Angel, is a blue Australian Cattle Dog/Basenji mix. The former comes from Australia, the latter from central Africa, both obviously warm climates. So she gets jacketed in the winter when the temperature or wind chill plummets to the single digits or lower, primarily because she doesn’t have a thick coat.

But that means she can better handle hot summers, unlike my parents’ oldest dog, Rolli. Not only does Rolli have a thick coat, but it’s mostly black. She can handle lower temperatures without a jacket (to an extent). In the summer, though, she’s mostly in the shade.

And while it’s perfectly okay to buy pet toys, don’t get sucked into making your furry friend more comfortable with a memory foam mattress or a deluxe cat tree. That’s what your lap is for.

I’m guessing whoever wrote this doesn’t have cats. Like with food, you don’t need to go all-out, but you need to bear in mind that cats love to perch. So give them plenty of sturdy places to perch and they’ll be happy. I have two shelves in my apartment made purely so my cat has other places to perch up and out of the way, but still be close by. I built them myself, too, so if you do something like that, definitely go that route instead of buying something pre-fab, and buy scrap carpeting from your local home improvement store as well. They sell off the leftovers from the end of carpet rolls at steep discounts — only downside being you might end up with a lot more than you need.

3. Grooming

And when it comes to grooming, skip the overpriced Puppy Palace and shop around. While an occasional summer trim may be in order, there’s no need for specialty ‘dos and luxurious bath products. This is one category where mutts have it made.

Talk to your pet’s veterinarian. Many offer grooming specials, and there may be a discount on grooming if you bundle with your annual or semi-annual exam. Definitely price shop, but also be sure to ask around for recommendations and check reviews online.

4. Vet Care

When it comes to your pet’s health, it’s hard to separate your emotions from your wallet. We all want our animals to be active and healthy, but does that mean prolonging their lives until it bankrupts us? We say no.

If Fluffy has a tumor, get a second opinion and then ask some hard questions. Is surgery absolutely necessary? Will it really help your animal’s quality of life? Or will it just cause her more pain?

If she does require an expensive operation, ask for paid-in-cash discounts, save up for a few months first, or make the tough decision to enjoy the time you have left together. Even if it’s heartbreaking, you must put the well-being of your human family first.

One thing a lot of people don’t realize is that there are people lined up with deep pockets who are willing to fund expensive veterinary care — if it’s worth it in the end. Pet got hit by a car and has a broken leg? Chances are your vet, or a little research on Google, can find a charity or sponsor who can cover the cost in its entirety or on top of what you can reasonably afford.

Also, pet insurance is becoming more widespread, so definitely consider it, especially if you have an older pet. Your veterinarian will likely have details.

Rack mount HDD enclosure, part 7

Build Log:

The enclosure shipped a little earlier than expected. FedEx reports it shipped on Monday from Nova Scotia, Protocase e-mailed me about it on Tuesday, and I received it on Wednesday. So let’s fill in a few gaps in the timeline.

Before discovering and talking to Protocase, I had started to purchase some other parts. From my local Micro Center, I acquired some 3mm “tailed” LEDs to use as activity lights in the case. And I already mentioned I had fans — 4x80mm Enermax fans, which is why I configured the case for 4x80mm fans in the front. I don’t have any filters for them yet — more on that later.

From Performance-PCs I ordered 3x60mm fans for the 60mm fan mounts on the rear of the enclosure. Since I’m going for a reasonably quiet build, the fans I bought were Noctua — who knew they made 60mm fans? — specifically the NF-A6x25. Performance-PCs had them for a good price and, again, I want to make this reasonably quiet. I also bought holders for the LEDs and a latching vandal-resistant switch for the power switch.

So here’s the parts list:

HDDs: 4x1TB Western Digital Blue in RAID 10
Fans: 3xNoctua NF-A6x25 (60mm)
4xEnermax UC-8EB 80mm
Enclosure: Custom enclosure by Protocase
Port multiplier: Addonics AD5SARM6G
Power supply: Enhance ENP-7025B 250W FlexATX
LEDs: Lamptron 3mm Amber LEDs, 16mm cable with 4xLED-3H-C LED holders
Switch: Lamptron 16mm vandal resistant switch, red ring
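One aside on the array before getting to cost: RAID 10 stripes data across mirrored pairs, so usable capacity is half the raw total. A trivial sketch of that math, using the drive count from the list above:

```c
#include <stdio.h>

int main(void)
{
    /* From the parts list: 4x 1TB drives in RAID 10. */
    int drives = 4;
    double per_drive_tb = 1.0;

    double raw_tb = drives * per_drive_tb;
    double usable_tb = raw_tb / 2.0;  /* every block is mirrored once */

    printf("Raw capacity:       %.1f TB\n", raw_tb);
    printf("Usable in RAID 10:  %.1f TB\n", usable_tb);
    /* The array survives any single drive failure, and even a second
     * failure as long as it hits the other mirrored pair. */
    return 0;
}
```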

There are some parts you can change out to reduce the cost, but not by much. Other 60mm fans are not going to be much less than what I paid for the Noctuas, with the trade-off that they might be louder — I found the Noctuas on special with Performance-PCs for about 9 USD, which tells me they may not restock them when they run out. The Enermax fans were only about 8 USD each from Amazon.

I selected the fans in question because I knew they’d be silent. Noctua is one of the best companies when it comes to quiet fans — others come close, but Noctua continually wins out. The color scheme of the fans is why I didn’t buy 4x80mm Noctuas for the front — that and the price (typically around 15 USD each). At least the 60mm Noctuas aren’t going to be noticeable unless you go looking for them.

Custom enclosure

First my thanks to Protocase for working with me with regard to the original quote. But let’s talk the good, the bad and the ugly with regard to this. And let’s start with a simple question: now that I have the actual case in hand, is there anything I’d change about the design? Three things.

First is the front face. The four holes I had cut for LED holders are a little too close. They’re the perfect diameter (6mm), but just a little too close together for the nuts that hold them in place, so some of the nuts can’t get a firm hold while others can. The easy fix in a revised design is to space them out a little further. For the enclosure as it currently stands, I’ve ordered some 1/4″ ID x 5/16″ OD x 5/32″ long spacers to provide a bit of a gap for two of the holders. I’ll see how well they work when they arrive.

Second is on the rear of the case, with the mounting holes for the FlexATX power supply. I didn’t make them large enough to accommodate a #6-32 screw. I have no idea how I screwed that up; they’re a few hundredths of an inch too small, but a 5/32″ drill bit easily fixed it. The screw holes lined up perfectly, thanks to a diagram published by SilverStone Tek.

Aside from that, there’s also the cutout I chose for the port multiplier: DB-50. I should’ve done a little research on this first, as the port multiplier is actually dimensioned to a DB-50S cutout, not a DB-50, the difference being that the DB-50S is about ½″ wider. For now, only one of the screws on the back of the port multiplier can be secured. The eSATA port is still accessible, but not in a way I’d be comfortable leaving for the long term. Perhaps I can re-mount the multiplier on a different kind of bracket that will fit a DB-50 cutout. I’ll look around for options.

Okay one other thing: I don’t have the four front fans quite evenly spaced. Needless to say, before publishing the design file, I’ll have a couple modifications to make.

The finish on the case is high quality, and the case feels very sturdy. The cutouts for the fans fit their respective fans without any problem. The hidden hardware is also a plus. There are embedded screws on the front and bottom panels to be secured down with an 11/32″ nut — use sockets and wrenches where needed to ensure everything stays secure.

Otherwise the enclosure came out exactly as designed and was shipped wrapped in enough bubble wrap to protect a small child. So how did the components fit?

Building the enclosure

The three 60mm fans mounted without any problem. I’m using the noise-isolating mounts that came with them instead of screws.

The four 80mm fans in the front ran into some slight interference from the lip at the front of the bottom panel. But last week I ordered several packs of Gelid rubber fan mounts, intending to experiment with them to improve the sound profiles of my various computers. And they pad out the fans perfectly on the front panel.

And I discovered the Gelid mounts can be attached to either the fan or the panel first. Most vibration-isolating fan mounts require you to mount them to the panel first and then attach the fan, which can be problematic in tight corners, such as the front panel of this enclosure. Panel-first mounting wouldn’t have worked here anyway, as it would’ve made it difficult to get the front panel flush with the lip on the bottom panel. The ability to attach the mounts to the fans, then the fans to the front panel, was a major bonus — and in small form-factor builds, such as microATX or mini-ITX cases, I can see that making things much, much easier.

I’ve ordered filters from modDIY, as I wanted filters that wouldn’t stand out, along with a power supply jumper switch. They ship from Hong Kong, so I’ll go without them in the meantime. I may end up practically destroying the Gelid mounts getting the filters put on, so it’s a good thing I’ve got plenty of them. The mounts are somewhat difficult to work with as well; pliers will come in quite handy getting them installed, so perhaps different vibration-isolating mounts are in order.

I ordered the jumper switch because it’s a lit switch, meaning I can cut the included switch off its wires and attach the vandal-resistant switch to the PSU adapter. Ideally I’d solder it, but I’m not sure if I’m going to do that.

For powering the fans, I purchased an NZXT Grid from my local Micro Center — with 7 fans, something was going to be needed to keep this sane. A SATA power splitter powers two of the hard drives, while the two SATA connectors from the PSU power the other two. And the Agestar HDD “bumpers” came in handy again for holding the hard drives — I do still have the CaseLabs HDD brackets, but opted against them for this, in part because I misplaced the hardware for them.

Right now there’s not much in the way of cable management. Eventually I’ll take some 3M Command clips — I used them in Absinthe and Beta Orionis — and use those to help manage the cables. They work much better than zip ties alone, as they pull and hold the cables out of the way. For the power switch, I’m currently using an ATX power switch I’ve previously used to jumper a spare PSU when bleeding the water-cooling loops. I have one LED currently run to the eSATA activity light; I intend to have four LEDs total, one for each HDD’s activity light.

Until I get all the small parts necessary to finish this up, I do at least have it running inside the enclosure to make sure everything works. I have stuff on the way from two companies to finish this out.

And so far the fans are quiet, comparable to the fans in Beta Orionis at least. It’ll probably be a little quieter once I get the cabling better managed. If necessary, I also have noise-dampening material I could put on the sides and top of the enclosure to curtail noise even more. When I tested the fans without anything else in the enclosure, you’d have to be right up near it to hear them, since the airflow had a clear path. Everything that’s in the enclosure now creates turbulence, so there’s some noise. I’m not expecting to eliminate the noise completely, just to get it to where it won’t be noticeable.

Benchmarks, bottlenecks, and buyer’s remorse – revisiting AMD vs Intel

I’ve had the wonderful chance over the last couple months to interact with some very arrogant elitists and fanboys. To those individuals the AMD vs Intel debate is over, and the clear and unchallengeable winner is Intel. There was even one person I encountered whose vocabulary the words “buyer’s remorse” had likely just entered, as he basically insinuated that anyone who defends the FX-8350 has it, even going so far as to open his post with a link to Wikipedia’s article on post-purchase rationalization — i.e. “buyer’s remorse”:

The 8350 is clearly not the better choice over a locked i5 for gaming, but people cannot stand to believe they’ve made a bad purchase decision. I don’t blame them; it sucks to purchase something and to find out there’s better options out there, but I don’t like how people try to justify their purchases to no end.

Talk about an elitist and arrogant position to take. Better options being available does not make a particular purchase a “bad decision”. Decisions have trade-offs, plain and simple. It doesn’t matter if you’re talking about cars, computers, espresso machines, or what have you. There are tradeoffs going with AMD, and there are tradeoffs going with Intel. There are tradeoffs going with an AMD graphics card, and there are tradeoffs going with nVidia. That doesn’t make either an inherently bad decision, and to act like going with AMD is an inherently bad decision shows the elitist attitude I’ve seen permeate much of this discussion.

On YouTube, there was one Intel fanboy who made this blanket statement: “AMD is one big failure since they made their first product FACT.” Oh brother. I’m guessing this guy thinks the FX processors are AMD’s “first product”. Goes to show no one is immune to dumb comments, regardless of what side you’re on.

AMD and Intel have both been around for a long time. Advanced Micro Devices (AMD) was founded on May 1, 1969, while Intel (derived from “integrated electronics”) was founded on July 18, 1968. In response to the idiot who made the blanket comment, I said this: “Crap since their first product? So tell me your experience with AMD’s 8086 and 8088 processors. Or did those processors come out before you were born?” That wasn’t AMD’s first product, but the point still stands.

In his book The Political Brain: The Role of Emotion in Deciding the Fate of the Nation, Drew Westen said “When reason and emotion collide, emotion invariably wins”. And without any doubt, many throw a lot of emotion at the AMD vs Intel debate and don’t talk rationally on it — and that holds true for both sides.

Let’s explore history a little. For quite a while, AMD and Intel have traded blows. Sure AMD is lagging behind Intel quite a bit right now, having not released anything groundbreaking in several years. Those who want to believe AMD can never do anything right obviously don’t realize that AMD’s pulled quite ahead a few times over the years, and in one case almost left Intel in the dust.

If you’re running the 64-bit version of Windows right now, you actually have AMD to thank for that. AMD developed the x86-64 instruction set targeted by 64-bit versions of Windows. It was renamed AMD64 on release and first implemented in the Opteron processor in April 2003, with the Athlon 64 following in September 2003. It would later become known as x64 to distinguish it from x86. Unlike Intel’s original 64-bit implementation, AMD64 is fully backward compatible with x86, which is why an x64 processor can run either a 64-bit or a 32-bit operating system, the latter just as if it were a plain x86 chip.

This put Intel in the position of underdog, and basically means you have AMD to thank for the fact that 64-bit desktop processors even exist today. But I don’t expect the Intel elitists to actually do that. Wouldn’t surprise me if they’re trying to find any way of disproving what I’ve just said. Intel didn’t ship its first x86-64 processors, under the EM64T banner, until mid-2004, more than a year after AMD’s first offering, and it did so by adopting the instruction set AMD designed.

Now Intel actually had a 64-bit processor called the Itanium, first released in 2001. It was announced in 1999 and quickly dubbed the “Itanic”. Compared to other established options, such as the DEC Alpha, it was a disappointment and definitely did not live up to the hype, making the nickname quite fitting. The Electronic Engineering Journal tells the story in their article “Sinking the Itanic”. Despite that, the Itanium is still around and still selling, for some reason.

While 64-bit computing has been around for quite a long time, AMD brought it to the desktop. And that wasn’t the first race AMD won, either, only the most striking because of Intel’s lag. AMD was also the first to ship a 1-gigahertz desktop processor. While Intel tried to contend they beat AMD to it, the evidence suggests otherwise, and it is generally accepted that AMD won the gigahertz race.

But where AMD has always won out over Intel is price-to-performance. I don’t believe Intel has ever come close — unless you’re buying your Intel processors used or waiting for price drops, and even then the gap may not close enough.

The point is that most trash talkers have likely not used AMD products for long, if at all. It’s a lot like the comments you see on Amazon by buyers who trash-talk an entire company after having a bad experience with one product and cannot understand why people would actually give 4-star or 5-star reviews. AMD has contributed significantly to the current state of desktop computing.

Today they’re lagging behind Intel, but that doesn’t erase the history AMD has made.

Bottlenecking

One thing I’ve seen way too much is the use of bumper sticker thinking by many of those in the “millennial” generation. With AMD vs Intel, this is no different. Arrogant, elitist people with superiority complexes armed with terms and phrases they understand only to the point that the phrases basically mean “Intel rules” take to the web in various forums to shout down anyone who dares speak favorably of AMD.

Which brings me to “bottlenecking” — arguably the most misunderstood and misused term when it comes to talking about computers.

Anyone who is familiar with business processes and process and operational management knows the term. With computing I’ve come to the conclusion that most who use the term don’t understand it. They know only what they’ve read: “AMD CPUs will bottleneck graphics cards”. Since bottleneck sounds like a bad thing, and the ignorant Intel owner is looking for a new reason to feel superior to his AMD counterparts, he automatically thinks that Intel doesn’t bottleneck graphics cards, assumes AMD always bottlenecks graphics cards, and so throws around the word as if it actually means something.

The term bottleneck, of course, comes from the neck of a bottle: the thinner neck greatly inhibits the flow of liquid leaving it. A similar term from military strategy is “choke point”. In business, a bottleneck is an inefficiency that slows down an entire process and keeps it from operating at peak capacity. It’s a very important concept to understand from a management perspective.

And it applies perfectly to computers. Your system is full of bottlenecks. An Intel processor will not eliminate bottlenecks, so don’t even bother trying to say such.

Storage is easily your system’s most constricting bottleneck. Read and write throughput will always hold it back. The fact that CD, DVD and Blu-ray drives are significantly slower than even HDDs is why optical drive emulation came into vogue about 10 years ago: you can drastically improve game performance by mounting a CD/DVD image instead of using the physical disc. Having an SSD curtails the storage bottleneck even more.

Memory is also a bottleneck. Your processor is always going to be significantly faster than your RAM. Anyone familiar with low-level programming knows the CPU can perform operations faster on registers than on memory. Compiler optimizations take this into account as much as possible, which leads to the common advice that you shouldn’t try to outsmart the compiler, because you can’t.
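If you want to see the register-versus-memory gap for yourself, here’s a minimal C sketch. The volatile qualifier forces the second accumulator to live in memory, with a load and store on every iteration, while the first is free to stay in a register; compile with optimizations (say, -O2) for the contrast to show.

```c
#include <stdio.h>
#include <time.h>

#define N 100000000L

/* Accumulator is a plain local: the compiler can keep it in a register. */
static long sum_register(const long *data)
{
    long acc = 0;
    for (long i = 0; i < N; i++)
        acc += data[i % 16];
    return acc;
}

/* Accumulator is volatile: every iteration goes through memory. */
static long sum_memory(const long *data)
{
    volatile long acc = 0;
    for (long i = 0; i < N; i++)
        acc += data[i % 16];
    return acc;
}

int main(void)
{
    long data[16] = {1, 2, 3, 4, 5, 6, 7, 8,
                     9, 10, 11, 12, 13, 14, 15, 16};

    clock_t t0 = clock();
    long a = sum_register(data);
    clock_t t1 = clock();
    long b = sum_memory(data);
    clock_t t2 = clock();

    printf("register: %ld in %.2fs\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("memory:   %ld in %.2fs\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}
```

Same loop, same data; the only difference is where the running total lives, and the memory-bound version is reliably slower.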

In short any component in your system can be a bottleneck. Having an Intel processor does not change this. Period.

Let’s talk graphics. The common definition of a bottlenecked GPU is one that doesn’t constantly operate at 100% usage while in a game. And with this demonstrably incorrect definition come a number of implications, conveniently always levied against AMD as well. “You’re not getting full performance out of your GPU when using an AMD processor” is the typical argument. The question they never want to examine is whether that actually matters.

Intel elitists always overstate the concern as well. They imply that the GPU not operating at 100% usage is a big problem, which it’s not, as I’ll show in a moment, while also implying that an AMD system cannot see any kind of performance gain from upgrading to a newer graphics card or to multiple graphics cards, which is demonstrably untrue.

In Beta Orionis, I have two GTX 770s in SLI. Currently I’m working through Bioshock Infinite. In game, GPU usage fluctuates from as low as 30% to nearing or reaching 100%, depending on where I am. Many would say this shows my graphics cards are bottlenecked because their usage never stays at 100%. What they never take into account is that the GPU may not need to run at 100%.

JayzTwoCents made a video back in August 2013 demonstrating a bottleneck. He clocked his i7-3770K down to the equivalent of a 1.8 GHz i3 running with a single GTX 680 and showed what a bottleneck actually looks like: high CPU usage paired with low GPU usage. In the video, GPU usage hovered around 50% while CPU usage was significantly higher. That’s a bottleneck.

If your GPU usage isn’t maxing out, but your CPU usage also isn’t maxing out, and you’ve got your game’s settings cranked up as high as they’ll go, that’s not a bottleneck. It’s just a game that isn’t challenging your system to its limits. It’s not automatically a bottleneck if your graphics cards aren’t maxed out — yet too many people who don’t understand what bottlenecks actually are will say otherwise: “Well, you’re not getting the maximum value of your graphics card if it’s not maxed out.” Perhaps it’s not being maxed out because it doesn’t need to be.
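To make that distinction concrete, here’s a toy C sketch of the reasoning. The thresholds are illustrative assumptions, and the usage figures are numbers you’d read off your own monitoring tools, not values pulled from any real API.

```c
#include <stdio.h>

/* Classify a reading the way described above: a CPU bottleneck is a
 * pegged CPU starving an underworked GPU; a GPU pegged at 100% is
 * simply GPU-bound; anything else is a game that isn't pushing the
 * system to its limits. Thresholds are illustrative only. */
static const char *diagnose(double busiest_core_pct, double gpu_pct)
{
    if (busiest_core_pct >= 95.0 && gpu_pct < 70.0)
        return "CPU bottleneck: the CPU is pegged, the GPU is starved";
    if (gpu_pct >= 95.0)
        return "GPU-bound: the graphics card is the limiting component";
    return "No bottleneck: the game isn't stressing either component";
}

int main(void)
{
    printf("%s\n", diagnose(98.0, 50.0)); /* like the JayzTwoCents demo */
    printf("%s\n", diagnose(60.0, 65.0)); /* neither maxed out */
    return 0;
}
```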

Watching frame rates in Bioshock Infinite with FRAPS, graphics settings maxed out and V-sync off, I always get at least 60 frames per second, with extended periods in the triple digits. Talk to an Intel elitist and they’ll probably tell you what I’ve just said is impossible, or they’ll pull a variant of the “pics or it didn’t happen” trope. Comparing CPU usage on my FX-8350 against GPU usage, the GPUs never hit 100%, but neither did any CPU core. The game’s engine spread its processing across the cores, and SLI spread the graphics processing across both GPUs. With only one graphics card I’d expect it to be maxed out, but with two, neither is.

When I disable SLI, the single graphics card is maxed out, and we see the typical fluctuations in usage percentage expected during the course of a game, with the CPU having no problem keeping up. So clearly the FX-8350 isn’t any kind of bottleneck for the GTX 770, whether single or paired in SLI. I have no reason to believe it’ll be one for any other graphics card currently on the market.

Which brings me to arguably the largest bottleneck of any gaming system: the game itself, specifically how it’s engineered. While faster hardware can get around poorly designed or implemented software, it’s an uphill battle, possibly requiring significant improvements in hardware to get noticeable results.

Drivers and the operating system can also contribute to or exacerbate any observed bottleneck.

Bottlenecks have numerous causes and numerous points for investigation or solution, yet most Intel elitists will readily assume it’s the processor once they learn the person is running AMD.

The biggest reason to analyze bottlenecks is not just to determine the extent to which they exist, but to weigh the cost of alleviating them against the capacity gained. Any money spent needs to be balanced by a commensurate increase in value and/or capacity. Even businesses are willing to live with bottlenecks in their processes if alleviating them won’t produce a desirable return.

As such, if you already have an AMD FX system, it makes absolutely no sense, in my opinion, to change over to an Intel system. Sure, doing so could alleviate a bottleneck, and whether the expense is worth the upgrade is entirely up to you, but I don’t see it being worth it. The gains, which could be rather meager, don’t justify the costs. After all, if the performance gain is one you’re unlikely to notice, why undergo the time and expense of changing over? If you have a water-cooled system, the expense is greater still, as you have to change not just the mainboard and CPU (memory can likely be reused) but the water block as well.

Instead ask yourself this: what kind of performance are you seeking, is your desire reasonable, and can your system deliver it? In my case, the answer is an overwhelming yes. And with most other games I’m going to be playing, I see no reason to not believe that will hold true.

What matters in gaming is the minimum frame rate. So long as your system can deliver at least the refresh rate of your monitor at its highest resolution, you’re golden. And if it can do that with all settings in your game maxed out, even better. And, again, AMD paired with a good graphics card can deliver that.
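That rule of thumb is just frame-time arithmetic. Here’s a trivial sketch, with placeholder numbers standing in for whatever your monitor’s refresh rate and your FRAPS log actually say:

```c
#include <stdio.h>

int main(void)
{
    double refresh_hz = 60.0;  /* placeholder: your monitor's refresh rate */
    double min_fps = 62.0;     /* placeholder: your worst observed frame rate */

    printf("Frame budget at %.0fHz: %.2fms\n", refresh_hz, 1000.0 / refresh_hz);
    printf("Worst frame time:      %.2fms\n", 1000.0 / min_fps);

    if (min_fps >= refresh_hz)
        printf("Golden: every frame arrives within the refresh window\n");
    else
        printf("Dips below refresh: expect visible stutter or tearing\n");
    return 0;
}
```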

But again, talk to any Intel elitist and they’ll likely say that is impossible. I’ve seen the comments all over the place.

Benchmarks

Benchmark scores have become little more than tokens of superiority — “yeah, well, my system can break 10000 on 3DMark Fire Strike while yours can’t”.

The problem with benchmarks is how misleading they can be. Being synthetic, they measure a static block of code, be it an algorithm or a 3D scene. This eliminates variables and gives a good baseline against which performance comparisons can be made. If you’re overclocking, they help determine whether you’re gaining ground.

But here’s where many end up getting led astray: they focus only on one or a few scores. I’ve seen it a lot.

Using one benchmark to determine the true performance difference between platforms is like taking a person’s time in a mile run and extrapolating it out to an entire marathon. For some, the extrapolation will be accurate. For most, not quite. The same goes for benchmarks.

This is why no reputable review site posts only a couple benchmark numbers. They tend to provide a wide range of benchmarks and frame rates. And here is where you learn one key detail: despite claims to the contrary, Intel does not have an absolute advantage over AMD, and where Intel does have the advantage, it’s not as striking as many believe.

This doesn’t stop the Intel elitists, and I’m sure I’ve only just enraged them even more.

TweakTown ran a performance comparison of the FX-8350 vs the i7-4930K driving GTX 780s in SLI and GTX 980s in SLI at 4K. In all measurements, the AMD processor gave the better scores. Unsurprisingly, the results were challenged by commenters. Many said they should’ve used the i7-4770K instead. Kenny Rucka Jarvis made a prediction: “They should have compared it to a 4770k or 4790k which would have blown the 8350 out of the water.”

TweakTown did just that, and the results showed the i7 winning out, but it didn’t “blow the 8350 out of the water”. TweakTown had this to say:

Spending an additional $180, or another 50% or so on top of the AMD combo, results in some decent improvements in frame rate. The issue again is, the performance increase is only around ~10% on average, while you’re spending 50% more money. Some games are scaling much better, with improvements of 15-20%, but still – you’re spending $180 more.

Again, the performance gain was not that significant in what they tested. While in some games the difference may be a bit more pronounced, look at the frame rates: in most cases the AMD processor was quite capable of keeping up with the Intel processor on both minimums and averages. Sure, the 4770K won out, but it didn’t “blow the 8350 out of the water”.
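TweakTown’s own figures make the value comparison easy to run. In this quick sketch, the AMD combo’s baseline price is an assumed placeholder; only the $180 premium (about 50% more) and the roughly 10% average gain come from the quote above.

```c
#include <stdio.h>

int main(void)
{
    double amd_cost   = 360.0;            /* assumed baseline combo price */
    double intel_cost = amd_cost + 180.0; /* the quoted $180 premium (~50%) */

    double amd_perf   = 100.0;            /* normalize AMD to 100 */
    double intel_perf = 110.0;            /* ~10% faster on average */

    printf("AMD:   %.3f perf per dollar\n", amd_perf / amd_cost);
    printf("Intel: %.3f perf per dollar\n", intel_perf / intel_cost);
    /* 0.278 vs 0.204: the premium buys proportionally less
     * performance than it costs. */
    return 0;
}
```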

In the comments to the 4K challenge involving the 4930K, Facebook commenter Gary Lefever observed “Show an AMD CPU in a positive light and the butthurt Intel warriors come out in droves! You’d think someone was talking about their mothers or something.” Pavan Biliyar responded:

I agree, although in defense of the butthurt, this article is somewhat biased, though not intentionally. All games tested were GPU-bound in addition to running them at really high resolutions and details, and likely single-player– all of which puts very little emphasis on impact of CPU. GPU-bound tends to favor AMD platforms by winning on perf/$, but they also defeat the purpose of upgrading as platforms from several year ago would end up with similar performance, whether we’re talking about a Phenom x6 or Core i5-700. Playing GPU-bound makes upgrade gain percentage per dollar unreasonable.

If you want a reason to upgrade the CPU, you should be playing CPU-bound, and therein lies a simple fix to make the butt hurt happy: CPU-bound scenarios will embarrass AMD every time. Trouble is that CPU-bound doesn’t favor fancy graphics card combinations if it even has proper multi-GPU support. I mean, that’s the whole idea, it’s CPU-bound. Except now we’re accidentally offending those butt hurt over their $350+ graphics setups, we’re not allowed to tell them they spent too much.

Of course getting more than one display can compensate and give those expensive graphics something to do, all while justifying an Intel platform– but wait, now we’re getting into a demographic that wouldn’t settle on AMD because it isn’t like they are strapped for cash. Making the comparison at this stage doesn’t make much sense. Although, I’d like to see a surround 4K review by Anthony Garreffa testing both platforms with maybe more graphics cards to compensate. Frame rates may be unplayable, but I’m more interested in how each platforms scales for their total price.

That being said, the rich stay rich because they are picky with their money, they don’t take the ‘spared no expense’ route– except when they go out and get Apple laptops. The majority of any salary bracket aren’t enthusiasts.

And that’s certainly a very striking observation. The majority of people who build a computer don’t care about anything more than getting what they need at the right price, and the question will come down to what will meet their needs. Benchmarks won’t tell you whether something will meet your needs, as benchmarks are merely performance comparisons involving static blocks of code, and so can lead people astray or cause them to overspend by leaps and bounds, meaning they’re not getting the most for their money.

Gamers are probably going to be a little more involved in their purchase decisions, but most computer buyers aren’t. Enthusiast gamers are the ones to avoid, in my opinion, as most I’ve encountered seem incapable of thinking with real costs in mind. I said this on the Linus TechTips forum:

I’m not going for super ultra-high resolutions with framerates faster than your eye can see, let alone what my monitors can actually display. Obviously if you’re going for that, you’re not going to be running AMD, but you’re also not going to be running a 3rd generation Intel, and probably not even a 4th generation. You’re probably going to have a 5th generation Intel with multiple 980s.

Later in that post I followed up by saying “So the question comes down to what performance are you seeking, and can your system deliver it? If no, then figure out what to upgrade.”

The additional question is why you’re seeking that kind of performance. Are you merely trying to improve benchmark scores, or is your system no longer capable of delivering what you actually need, not what you think you need? Are you trying to compete with others purely on the egotistical notion of being able to brag about your system and the benchmarks and frame rates it can achieve, or can you save money and actually buy a system capable of delivering a decent experience that won’t send your bank account into the red or max out credit cards?

Conclusions

Now many Intel enthusiasts, or “Intel warriors” as mentioned earlier, will probably look at all of this and call it one giant case of “buyer’s remorse”: “Wow, you wrote all of that. That’s a lot of guilt over a bad purchase. Stop trying to defend your bad choices.” Creationists have said similar things about those who defend evolution, calling it “going to a lot of effort to disprove God”. Many Intel enthusiasts have also adopted the point of view that it doesn’t matter what AMD puts out, it’s crap and should be avoided at all costs — similar to how many anti-vaxxers will almost automatically be against anything labeled a “vaccine”.

Or they might cop out and say “well, your experience isn’t typical”, but the only thing atypical about my setup is the SLI configuration — that and the fact I’m using a 32″ television for a monitor (and it works quite well!).

The thing is, in virtually every measurement I’ve seen, Intel doesn’t provide significant performance gains over AMD. It’s better, but not so significantly better as to, in my opinion, justify the cost. And an AMD FX processor will not bottleneck a graphics card!

Back in June of last year, I said this to a friend of mine on Facebook:

It’s like gamers saying “Yeah well, my system can do BF4 at 150fps, which demolishes yours which can only do 80fps”. Okay…. but can you tell the difference between 150fps and 80fps, or will both appear to be smooth renderings to the casual observer?

The AMD vs Intel gaming debate is just one giant dick-measuring contest, in which benchmarks and frame rates are substitutes for inches in length or girth, and claims that AMD will “bottleneck graphics cards” or is otherwise substandard are the substitute for insinuating a guy has a small prick, or can’t get a woman off, or what have you…

Whether a particular need can be adequately met by an AMD CPU is ignored in favor of pointing out that Intel can do it better. “I can get 150fps in Battlefield 4 while you can only get 80fps” is the equivalent of saying “I’ll bet she can cum a lot harder with my cock!” Yes, I’m intentionally making sexual remarks to show the absurdity and irrelevance of this whole discussion and the degree to which it’s blown out of proportion. The discussion long ago lost any sense of rationality or sanity and has turned, in essence, into a substitute for competitions over prick size and sexual ability. (Perhaps that’s why I have no qualms going with AMD, as I have no problems satisfying my wife…)

Does it ultimately matter that Intel can do it better? If you answer that question in the affirmative, you need to rethink your perspective. For most, it won’t matter. For the relative few, they’re already aware it matters, and they already have other requirements that can only be met with higher-end hardware.

As I said on the Linus TechTips forum, quoted above, if you want to be able to play every game on the market maxed out at the highest resolutions, you’re not running an AMD processor. You’re probably not even running a 4th generation Intel (4xxx series i5 or i7). You’re probably running the i7-5960X in a system with two or three GTX 980s, eagerly anticipating the release of the Titan X and Broadwell — and everything is probably water-cooled and overclocked as far as you can take it.

In other words, if you’re a performance enthusiast, money is likely no object. And when money is no object, there is no competition possible. It’s a notion we see time and time again, not just with computing but with everything else in life.

For everyone else, AMD is a viable option, so don’t overlook them. Yes the processor is a couple years old, but it’s still quite a contender. For gamers the GPU matters more anyway and, again, an FX processor will not bottleneck a graphics card — but that won’t stop elitists from continually saying it will.

Reply to Denise Ngo about “good guys”

Let’s play a game called “how many women can I piss off with this article?” Prompting this latest match is the article on Yahoo! Style called “11 Things a Good Guy Will Never Ask You to Do” by Denise Ngo. I may skip over sections with which I hold agreement for the sake of brevity — especially since on the other sections I’m bound to not be brief. Let’s begin.

1. His Laundry

Pairing your socks isn’t exactly our idea of a stay-at-home date, nor does living with you make us a 1950s housewife. A good compromise is for one partner to sort and start the laundry and the other to fold and put it away. Plus, studies show that helping around the house increases a man’s chances of getting laid. So, how ‘bout that pile of dishes? If you clean up the sink while we tackle the living room, we’ll get to the bedroom twice as fast.

Compromise is whatever you and your husband or significant other make it out to be. Compromise is a necessary part of any relationship, and dividing chores is part of that compromise. Part of that compromise means that the regular laundry may fall to one person just as other regular chores may fall to the other. As such, a “good guy” may negotiate the chores arrangement such that you will be doing his laundry along with yours. As laundry can, at times, be an arduous task, make sure to negotiate for him to pick up two or three other smaller chores in exchange for that one.

At the same time, though, do not use sex as a motivational part of that negotiation. Once you start doing that successfully, you’ll likely find yourself using sex to get whatever you want. And once he becomes aware or suspicious that you’re doing that, he’ll be more likely to stray.

Compromise is a good thing, but sex should never be part of that compromise.

2. Buy gifts and cards for other people on his behalf.

We’ll help when we’re out with you, but no, we won’t make a pit stop at Hallmark and Laura Ashley while we’re shopping with the girls. Just because we’re women doesn’t mean we’re automatically adept at figuring out your Aunt Martha’s dress size.

This item has so many exceptions it’s ridiculous.

If he already knows what to buy for a family member and where to get it, and you happen to be going there or will be in the area, don’t be offended if he asks you to pick it up for him. It’ll save him the trip — and the time and gas that goes with it — and he might think of some way to thank you for it.

Obviously we’re not going to expect you to know what our family members will like, or what clothing sizes they take — and any guy who expects you to just magically know or just intuitively select the perfect gift for one of his family members should be kicked to the curb, but again there are exceptions. If you’ve spent a decent amount of time conversing and visiting with his family members, don’t be surprised if he asks your opinion on a gift, or, Lord forbid, even asks you to select something. But, should that occur, it should be given as a gift from both of you instead of just from him.

But then if you two are a couple, any gift you give to others will be assumed, if not expected, to be from the two of you.

3. Plan an entire vacation without his help.

When we ask you whether you’d rather spend our anniversary in Cabo or Vermont, we want you to express an actual preference, not to say, “Whatever, I’m happy with what makes you happy.” The same goes for the hotel, the airline, and the restaurant reservations. Letting us take the reins isn’t considerate, it’s just lazy and boring. Instead, make sure to divvy up the planning. We pick the location and hotel; you plan the activities.

Letting you take the reins in planning the vacation is quite different from asking you to plan the entire thing. But here’s the thing: he genuinely may not care about where you go. I really hope that doesn’t seem like a foreign concept.

One thing that drives my wife up a wall at times is when we’re going out to eat and she’ll ask my opinion on where to go. Typically my response is “whatever is good for you”, mainly because I genuinely do not care. And I’ve said that to her numerous times. I don’t do that to be lazy. We’ve been together long enough that she’s learned to instead ask if my preference is to sit-in or grab and go, but that I don’t really care where we go to eat, as unless I’m in the mood for something particular, I’ll be able to find something on the menu that’ll do just fine.

Now that’s just eating out, not an entire vacation. At the same time, it still applies.

Don’t be offended if it seems like you’re doing the majority of the planning, or think we’re somehow not a “good guy” because we’re not chipping in our opinion as often as you’d like, because we may not actually have an opinion on the matter. We may genuinely not really care about the specifics of a vacation.

On the last real vacation my wife and I took together, almost 8 years ago, I did most of the planning — in part because the expenses were coming out of my bank account (we weren’t actually married yet). Sure, we both selected the destination, as we wanted to go to Busch Gardens in Williamsburg, VA, and spend a few days out there. Aside from that, the specifics of where we stayed, what vehicle we rented to get out there, and the like were entirely on me, because my wife didn’t care about those details — and your significant other may not care either.

But then, if you’re engaged or married, you should already know that.

And if you’re still dating but not quite heading to the altar, this is a good opportunity to see how compatible you two really are. True, he should at least help you decide the destination and should contribute an opinion every now and then. But if the vacation is your idea, don’t be surprised or offended if you end up selecting most of the specifics. And again, he may not care about the specifics, something that can work quite well to your advantage when planning things.

Yes I know I’ve repeated that phrase several times, but then it seems like something some women just don’t actually get. They expect men to think like women, to care about things the same way they do, to have opinions on matters they care about, when we just genuinely may not give a shit. And if your guy is actually telling you “whatever makes you happy”, it likely means he truly doesn’t care and will go along with whatever you plan — within reason of course.

Want to snap him into actually helping you with the planning? Plan something unreasonable. But then that could end up backfiring, as doing something like that could also erode his trust in your ability to make sound, reasonable decisions.

4. Make him a sandwich.

Oh, this old canard has reared its ugly head again:

The refrigerator is 10 feet away and your game control has a pause button, so get up, stretch, and slap that ham and lettuce together by yourself. We don’t care if you’re “in the zone,” because apparently, you were out of it long enough to articulate your immediate need for a nibble. Maybe we’ll consider it if you agree to break from the game for 20 minutes, put on some coffee, and enjoy your afternoon snack with us.

Question: do you feel any qualms about asking him to make you a sandwich? If no, then shut up!

The only way this point is valid is if he’s rarely making his own snacks. But asking the other to do something, even something as innocuous as making a sandwich, is part of living together. It doesn’t matter if you think we could just hit the pause button and get up and do it. It’s about the same as in point 2 where you may be asked to do something simply because you happen to be in the area.

Now if you don’t want to do it, say so! A “good guy” won’t hold that against you. He’ll just wait till he reaches a good stopping point and do it himself.

And saying that you’ll “consider it if you agree to break from the game for 20 minutes” isn’t what’s being asked. He is asking you so that he doesn’t have to break from the game. And if you’re going to get offended at being asked to do something, even something as innocuous as preparing a snack, perhaps you need to reevaluate your perspective.

After all, have you ever asked him to make you a snack or fetch you a drink so you wouldn’t have to break from reading your book? If you’re going to ask it of him, expect him to ask it of you. At the same time, if he asks it of you, feel free to ask it of him as well.

5. Change your relationship status on Facebook.

We believe our life outside of the Internet should speak for itself. On the off-chance that we break up, wouldn’t you rather tell your close friends in person, rather than have that ever-present broken heart appear on 500 people’s newsfeeds? Well, we would, so don’t even ask us to include our relationship status on Facebook in the first place.

Yet at the same time, many women get offended when their guy doesn’t change their relationship status on Facebook to show that they’re “taken”. I’ve even seen it on lists of how to tell if your guy is “committed”. Just Google the phrase (without quotes) “he won’t change his relationship status on Facebook” to see the hypocrisy behind this notion.

Ladies, if you expect us to do it, don’t get offended if we want you to do it as well.

And one other thing: do you actually know 500 people? Clean out your Facebook friends list. This goes for both of you!

Good God, the only people that should be on your friends list are people you’ve actually known at some point. Not people you merely interacted with, or someone you’ve heard of because they’re friends or family of people you do know, but people you genuinely know. Before accepting or making a friend request, you should be able to answer, without much hesitation (hey, even at 34 my memory is sometimes a little slow) how you know that person. If you can’t answer that question, you shouldn’t link up with that person on Facebook.

Which means if you have enough people on your Facebook friends list to occupy the small town where I went to college, you have way, way too many. I can say to a reasonable degree of certainty that you do not know all the people on your Facebook friends list if you have that many. So clean it out!

6. Be his wake-up call.

If we wanted a newborn, we’d just pierce the condom. Kidding! But really, buy an alarm clock. Remembering a man’s nap and wakeup schedule should be an occasional favor, not an everyday obligation.

Let’s go back to point 4, about being asked to make a sandwich, as what I said there applies here as well: both should be an occasional favor and shouldn’t seem like a burden or obligation. At the same time, this item’s inclusion in the list makes it sound like asking even once or twice is reason to not consider him a “good guy”, since the title of the article does say this is something a “good guy” will never ask you to do.

7. Take care of his drunk friends.

We’ll help them hail cabs or drive them home, but our couch really shouldn’t be a post-happy-hour crash pad.

Remember, this goes both ways. Don’t expect us to harbor your friends when they get wasted. But at the same time, understand why they may be staying overnight.

Back in college, my roommate and I had a group of mutual friends. They took him out for his 21st birthday — I hung out with my girlfriend — and he came back a little sloshed. Because of the racket he made coming in late, I told one of our mutual friends to keep him at their place the next time it happened. Thankfully it never came to that, as he never went out drinking to that extent again that academic year.

Going to skip 8 and 9 since I don’t have anything to add to them, other than to remember that those items also go both ways, much like most of the previous points.

10. Lose weight.

Time to tread lightly…

We’ll tone up for health purposes and for ourselves, but if you’re really concerned about the 5 lbs we gained over the holidays, don’t flat-out complain that we’re getting flabby. Instead, invite us to go biking with you or to take a yoga class together. Treat exercise as a fun activity we can do together instead of something we should do just for you.

Let me first point to two other articles I’ve written about this: “Divorcing over weight gain” and “Reply to Ragen Chastain“. Needless to say, there are exceptions to this — major exceptions to this.

My wife is overweight — obese, actually, by medical definitions. She’s gotten the “you need to lose weight” statements from her nurse sister, nutritionist mother, and also from doctors. We now know that there are underlying health concerns that contribute to her weight, and they are being addressed.

That said, the “if you’re really concerned about the 5 lbs we gained over the holidays” remark is situation specific.

If it comes during a string of weight gain, we have every reason to be concerned. If you went from 120 lbs when we first started dating and that extra 5 lbs is on top of an additional 40 lbs of weight you’ve gained in the interim, we not only have every right to be concerned, but every right to point it out.

Now this is provided you weren’t rail thin at 120 lbs. If we could see outlines of bones against your skin and now we can’t, yeah we shouldn’t complain about an extra 5 lbs, provided, again, that extra 5 lbs isn’t on top of an already unhealthy weight.

If you went from rail thin or a healthy weight to overweight, and added 5 lbs to that, that is reason to complain or be concerned. If we have reason to believe it to be a symptom of an unhealthy trend, we have reason to complain. If we invite you to fitness classes or biking trips and you refuse to join us, then we have even more reason to complain.

But, as I said in “Divorcing over weight gain”, that only holds true if he has kept his weight in check.

11. Keep our hair long.

Trust us, short hair is cute, fun, and just as feminine as back-length hair. Just look at Halle Berry, Audrey Tautou and Keira Knightley, circa 2005. It’s not as if we’re going to shave it off or sport one of Rihanna’s hairstyles, but even if we did, we hope you’d find us just as attractive.

If you want to keep your hair short, there’s really only one thing you need to discuss: maintenance. Shorter hair is much easier and less expensive to maintain, requiring less shampoo and conditioner along with less time in the shower. Now, on the cost side, we’re not talking about significant savings — perhaps a couple dollars a month depending on what shampoo and conditioner you use.

But on time the savings can be significant, especially if you have a finicky water heater or one that’s not all that large and can be drained quickly. This means he may not have to wait nearly as long to jump in the shower, or if he’s typically quick anyway, there’s a better chance he’d have more than enough hot water available if he does. And if you shower together on the weekends, it can mean more time to just sit under the hot water together (depending on shower layout, of course).

The weight of longer hair can also contribute to headaches or exacerbate any scalp concerns.

Let’s directly address this notion: “It’s not as if we’re going to shave it off or sport one of Rihanna’s hairstyles, but even if we did, we hope you’d find us just as attractive.” You can hope all you want, but it may not happen. And if you expect that he’ll still find you attractive despite what could be a drastic change in appearance, hopefully you’ll do the same if he drastically changes his — provided you don’t first jump to the conclusion he’s cheating because of his drastic change in appearance (yes, I read those articles, too).

Again, much like the rest of this article, what you expect from your guy you’d better be open to him expecting of you.

How your hair is kept is more important than its length. As such, Audrey Tautou isn’t really a good example of a woman who looks attractive with short hair, in my opinion. None of the pictures I’ve seen of her with short hair flatter her, simply because her hair is naturally wavy, and naturally wavy hair always looks better longer — I’ve yet to see an example of that not holding true. And the wavier the hair, the worse it looks when kept short unless it’s styled. Which brings me to Halle Berry. Her hair has a wave or curl to it, but she has typically had her hair styled, and keeping hair styled can get costly quickly. And she’s almost always had short hair — anyone know of a time when that wasn’t true?

Keira Knightley‘s hair is typically straight or pretty close to it, making it easier for her to keep it short or long and still look great, and she’s worn it both short and long over the years. Jena Malone‘s hair is similar because it is straight.

Rihanna‘s styles over the years have been touch and go, I’ll admit. She’s had some nice hair styles, and some that were… questionable. So if you wanted to sport one of her styles, I think you’d have to be specific on which style you wanted to wear.

But getting back to the original point, a “good guy” may ask you to keep your hair longer. Longer hair is generally considered more attractive. My wife used to keep her hair really long, and I didn’t chide her for wanting to cut it short back in 2006 and keep it that way. She had very good reason for wanting to take her hair down a couple feet — yes, it was long enough that they were able to take off 27″ for donation and still keep it at what we both felt was a reasonable length. She didn’t want to go super short, and I didn’t want her to take it that short either.

What would make him not a “good guy” is if he obligates you to keep your hair long.

And that’s kind of the thing about the title of the article. A “good guy” may ask or request numerous things of you, even things you may not want to do. What he will never do is force you to do them or obligate you in some other way. Plus whatever you request of him you’d better be ready for him to request of you.

Rack mount HDD enclosure, part 6: Follow up with Protocase

Build Log:

After writing such high praise on the Protocase Designer software — which Protocase shared via their Facebook page and Twitter — I was torn on writing this next segment, but I must in order to be completely honest about my experience building this project.

Plus if I’ve had a questionable experience with a company, I’ve made it a point to write about it: CaseLabs (here), Performance PCs (here, here, and here), FrozenCPU (here), Amazon (here), FedEx (here and here), AquaTuning/AlphaCool and their insurance company… I believe in being honest about my experiences — and where I’ve previously talked about a problem I’ve had, I’ve not only said what the problem was, but how it was resolved.

So too it is now with Protocase.

The quote

The original quote I provided in part 5 of this build log was erroneous. For reference, this was that quote:

[Screenshot: the original quote]

Before continuing, I need to fill in some other side details. While trying to figure out what I’m going to do for an enclosure, I’ve since put together a proof of concept for the internal components: 4x1TB WD Blue hard drives connected to the Addonics port multiplier, which connects to my computer via eSATA, all powered by a FlexATX power supply. The hard drives are configured for RAID 10 as I’d planned — I’ll elaborate more on how I got that set up later (hint: it didn’t involve the SIIG RAID card I’d been using).
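For those wondering about capacity: RAID 10 groups the drives into mirrored pairs and stripes across the pairs, so only half the raw space is usable. A quick back-of-the-envelope check (Python, purely for illustration):

    # RAID 10: drives are grouped into mirrored pairs, then striped across pairs.
    drives = 4                   # 4x WD Blue
    size_tb = 1.0                # 1 TB each
    raw_tb = drives * size_tb    # 4 TB raw
    usable_tb = raw_tb / 2       # mirroring halves it: 2 TB usable
    print(usable_tb)             # 2.0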

[Photo: the proof-of-concept setup]

What prompted me to make that move a little sooner than expected was some hard drive corruption that unfortunately caused me to lose the original file I’d sent in to receive that quote. So I decided to build the proof of concept, partly as a test to make sure everything was going to work. After getting the RAID set up and everything re-installed, I re-downloaded Protocase Designer and rebuilt the file. Allow me to heap a bit more praise on Protocase Designer: it only took me about 20 minutes to rebuild the file from scratch to what I originally had, mostly thanks to the cutout library.

When I queried for an online quote, with the intent of actually placing an order, I received a bit of a shock:

[Screenshot: the new, much higher quote]

That’s a bit of a disparity!

I wondered what I had done differently to get the drastically different quote, so I tried changing the enclosure’s materials and other options. I even built out a new file from scratch again and tried getting another quote. The numbers were about the same, so I wrote in to the sales team with the new file. After a few e-mails back and forth, they agreed to honor the original quote — the one for which I actually had a screenshot (lucky me for taking it) (emphasis mine):

The design team took a look and at the present time we don’t see any major concerns, this is not to say there wasn’t some type of error when you recreated your file. We did have a minor concern which was addressed immediately. You may have created your design when this was happening. We do apologize for this. The second price is a more accurate price.

If the design is needed I can work with you on the price. We can honour the price for you this time . If you would like to call I would be more than happy to talk to about this.

During the course of the conversation, I had made a couple additions to the design — the original design didn’t have any way out the back to exhaust the air, so I added a few 60mm fan mounts. Having made the changes, I proposed a new offer: 175 USD for the work and setup, plus shipping — which would bring it to shy of 200 USD. Basically I was taking advantage of the fact they were willing to honor the original quote, but since I’d made modifications, I proposed something a little higher.

After sending the e-mail back to them with the proposal and new file, I went digging through my IE search history, having noticed it appeared to have been preserved. One of the benefits of having your Windows 8.1 local account linked to your Live profile is that IE will automatically save off your history. And the original quote was still in my history. When I visited the page again, it showed that my original quote was indeed caused by a flaw on their end:

[Screenshot: quote no. 38273, recalculated]

Quote No. 38273, quoted at 299.92 USD plus setup with free shipping. Contrast that with the original quote of 79.38 USD plus setup and another 18 USD for shipping. Yeah there was definitely something not right when I got the original quote.

RAID setup

Let’s briefly segue into the RAID 10 setup. In a parenthetical above, I noted that the RAID is not going through the SIIG card I had installed. One thing not mentioned in the manual, or anywhere else, is that the card will not let you create a RAID that includes drives on a port multiplier. So I have the RAID going through my mainboard’s BIOS instead. I had to turn off the RAID’s caching options as well to keep it stable — with caching turned on, it would periodically crash, and it eventually got to a point where the RAID controller kept telling me 3 drives were offline.

There is a slight performance penalty to turning off caching, but it’s nothing serious. I created a backup image of the new installation using Macrium Reflect, which showed read throughput of over 2.5 Gb/s — note that’s nearly a 70% improvement over the specified sustained read rate for the WD Blue. Still slower than an SSD, but I think SSDs are overrated anyway. And if anyone wants to challenge me on that, feel free, as I can tell you about my experience using an SSD heavily on a software engineering computer for the last 3, approaching 4, years. It’ll probably change your perspective on how good they actually are.
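For readers more used to MB/s, here’s that throughput figure converted — a quick sketch, assuming decimal units:

    gb_per_s = 2.5                    # reported read throughput in Gb/s
    mb_per_s = gb_per_s * 1000 / 8    # 1 Gb = 1000 Mb; 8 bits per byte
    print(mb_per_s)                   # 312.5 MB/s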

After the weekend

The following Monday after sending the e-mail, I received a reply from the Protocase representative. I didn’t hear back till mid-afternoon, and until I received it, I wondered if my e-mail had been filtered out for having an attachment, like the e-mail I initially tried to send to CaseLabs (see link above). After first apologizing for the delayed response, the representative informed me that she wanted to put my proposal in front of her boss — which I expected — and then reiterated that the erroneous quote was likely caused by an issue on their end: “Since this was our mistake I know it won’t be a problem to work with you based on your proposal.”

Unfortunately her boss was expected to be out of the office till Friday, so I inquired if there was someone else to whom she could forward the proposal for review. I wasn’t on a time crunch for this, and the proof of concept was holding stable, but if things could get turned around sooner, then I’d go for sooner — especially since they have a 2 to 3 day lead time once the design goes into production. She also stated in her e-mail that the design would need to be forwarded to the design team for review, which is part of their procedure.

The next morning, the representative decided not to wait and instead gave the necessary authorizations, making sure to document why she was selling me a custom enclosure at what amounts to a 40% discount. The order was pushed through at a total of 193 USD — 175 USD plus 18 USD for shipping. It has gone through all the necessary approval steps and should now be going into production.

I’ll update the series when I have the enclosure in hand, which will probably be toward the end of the week. Production should take two to three days and be shipped by 2-day shipping.

Concealed carry without a permit

Also known as “constitutional carry”, permitless concealed carry means a State does not require a permit to carry a concealed firearm. A number of States already allow it, and more are now considering such allowances. The Washington Post recently reported on it, and said this:

The American public has recently been tilting toward gun rights; a Pew poll last month showed gun rights supporters pulling ahead of gun control supporters 52 to 46.

But Americans also want background checks, which permitless concealed-carry laws could do away with. A Quinnipiac poll last year also showed that an overwhelming majority of voters, both Democrat and Republican, support background checks for all gun purchases. A similar majority would also bar people suffering from mental illness from purchasing guns.

“Permitless concealed carry” laws would not do away with background checks, since Federal law still requires a NICS check for every firearms purchase through a Federally-licensed (FFL) firearms dealer.

More States are pushing for “constitutional carry” simply to eliminate the ability for law enforcement to deny permits on faulty grounds. For example, after Illinois was basically forced by a Federal Court to allow concealed carry, the Illinois State Police just started denying permits without reason — and appeared to be doing so predominantly to black applicants, even those without any criminal record or history of any kind. Part of that is due to Illinois being a “may issue” State, not a “shall issue” State like Missouri.

Speaking of Illinois, being arrested too many times — regardless of whether charges are actually pursued — can also disqualify you from obtaining a concealed weapons permit. And they can also deny a permit if they have “determined by a preponderance of the evidence that you pose a danger to yourself or others/are a threat to public safety”.

But beyond that, background checks for concealed carry permits tend to pull up things that don’t bar a person from actually possessing a gun. A person convicted of a felony as a juvenile who has had that record expunged may be able to obtain a firearm once they reach appropriate legal age (18 for long guns, 21 for pistols), but that previous record could come back to bite them when they apply for a concealed carry permit.

For example, one person wrote on the Expert Law forum detailing his experience, in which a juvenile offense was used to deny him a permit:

My question involves criminal records for the state of: California

When I was 14yrs old I was arrested for fighting at school. I took a plea bargain as a infraction was fined 35.00 and given 2 day of work detail. I had my record sealed when I was 18 and have not been in trouble ever since. I have TS clearance in the military last done 2009. I am now 41 years old and have been denied my concealed weapons permit in the state of Georgia for a battery charge in a city (that doesn’t exist on the same date that I got into the fight in 1987. I called the county juvenile department and was told the records do not exist and would not have followed me as it was a infraction. I spoke to the probate court in Georgia they said they never seen anything quite like this before due to the fact that it only had a date and charge and had to deny the permit as a open case against me on file.

On the forum The High Road, a corrections officer in the State of Georgia found himself denied a permit due to clerical errors with the Court:

my so called terroristic threat charge from when I was sixteen (which was simply a cruel rumor started by classmates which is why it held no legal grounds) is still listed as a charge on my record. The probate judge who handles CCW’s did not have any info other than the actual charge on hand…..he had no information on what happened to the charge after it was filed almost 10 years ago. So I had to explain what happened and show him a peice of paper from the DA’s office that stated that the charge was dismissed almost as soon as it was written.

He was also charged with being a minor in possession of alcohol, which is a simple misdemeanor.

And YouTube user MrAk47master posted a video on March 4, 2011, showing the letter in which he was denied a concealed weapons permit by the prosecuting attorney’s office for Houghton County, Michigan, over a juvenile record in California with offenses that, under Michigan law, would’ve been felonies had they been adjudicated in an adult court. The juvenile record does not prevent him from purchasing firearms.

So this is a person who was never convicted of a felony and so is not barred by Federal law from purchasing a firearm. His juvenile record does show he has misdemeanor convictions in the State of California. If he were an adult and those charges adjudicated in criminal court, as opposed to juvenile court, then they would’ve been felony-level charges under Michigan law (who knows what they would’ve been under California law), and Michigan used that to deny him a concealed weapons permit.

Now while a lot of people might say that such records should disqualify someone from getting a concealed weapons permit, I must vehemently disagree.

You see in the United States we have this thing called “due process”. And under due process, you cannot refer to someone as a felon unless they have actually been convicted of a felony! The due process clause states, quite plainly, that you may not take away a person’s rights without due process of law. And before a person can be deprived under Federal law of their Second Amendment rights, they must be a felon, which, again, requires they be actually convicted on felony charges. And if the Court determines that a person’s juvenile record should be sealed, then that means it should not be used against them later in life — not as an impediment to obtaining a firearm, and not as an impediment to carrying that firearm.

But to gain the ability to legally carry a firearm concealed in public, we have to file an application for what should be readily recognized as a right given the Second Amendment. And that application process operates on a “guilty unless a search comes back empty” premise. I’m sorry, but that’s not how it works.

And at the same time, a person who has not been convicted of a felony should not be denied a concealed carry permit simply because the charges on which they were actually convicted would have been felony charges had the circumstances been a little different.

That is what “constitutional carry” laws prevent. A person who is legally able to purchase a pistol should be legally able to carry that pistol, concealed or open, without first having to apply to the government for permission to do so.

Absinthe – Part XX

Build Log:

Okay so for Valentine’s Day I bought my wife an R9 290X to replace her GTX 660 pair.

Previously I’d talked about going with a GTX 970, mainly because they are a little better than the R9 290X on performance while consuming a lot less power — as in one R9 290X uses about the same power as a pair of GTX 970s. So why did I not go that route? The 3.5GB memory limitation on what is advertised as a 4GB card.

While for most that memory limitation won’t make a difference, I know how my wife runs her system. For most people, 8GB of RAM is more than enough. For my wife, it was barely breaking even, so she got bumped to 16GB, and so far she’s running smoothly. It’s also why I water cooled her system: when I first built out the computer that would become Absinthe, I saw her taking massive advantage of the new power her system had, and the temperatures were getting me a little worried. In a way, that shows that AMD makes good processors and ASRock and ASUS make good mainboards — she runs her computer pretty damn hard.

We’ve done Radeons in the past, and it was in Q4 2013, with a GT 620, that I bought the first “team green” nVidia card for my wife’s computer, which was originally built out in 2007 (AMD Athlon X2 with 4GB RAM). That would eventually get upgraded to the water cooled GTX 660 pair.

And now it’s a single R9 290X. Will it become a pair as well? We’ll see. But yes, #RAMgate is the reason I avoided the GTX 970. I did not want to chance that becoming an issue with how my wife runs her system. I don’t know if my wife is aware of the GTX 970’s limitation. If she is, then she’d probably opt for the R9 290X as well and not chance the GTX 970.

So specifically the R9 290X I bought is the XFX “Double-D” model. Initially the card was installed into her system on Valentine’s Day while she was at work. It required modifying her loop in a couple important ways.

For one, the R9 290X has a full-length PCB, and the cooler on the XFX model extends beyond that by another inch or close to it. Contrast that with the relatively short PCB of the GTX 660, which is comparable to the GTX 970 — one of the other reasons I was considering that card. The longer card meant I needed to push her pump and reservoir over beyond the edge of the UN Z2 bracket to which it was mounted.

I also didn’t have a water block for it immediately. Even if I had, I wanted my wife stressing the card for a couple weeks before mounting the block. So to get the card in her system without throwing everything back on air, I just bypassed the graphics card and ran a piece of soft tubing from the bottom radiator to the top — this put her CPU on 7x120mm of radiator capacity. The coolant was just distilled water.

So that’s how she ran the system for two weeks. In the meantime, I looked around at blocks to decide which direction to go. In the end, going on an in-depth analysis I read online, I went with the AquaComputer KryoGraphics block with the backplate. The analysis showed it to be the best overall block for the R9 290X: it tested to within 1C of the top-performing block for GPU temperatures, which was Swiftech’s Komodo, while its VRM cooling with the backplate topped the list by a significant margin.

I ordered the block through Performance-PCs — with the recent FrozenCPU hiatus, they were the only stateside distributor. And they had one backplate and two of the blocks in stock when I ordered. The only other option would’ve been ordering directly from the manufacturer. It would’ve been a similar price, but would’ve included a nearly 2-week wait time before shipping. FrozenCPU had the block, but not the passive backplate.

The one thing I found rather intriguing is how the block is shipped vacuum packed:

Unfortunately due to the washout from the flash, the full beauty of this block can’t really be made out, and I forgot to take pictures of the card with the block installed. The promotional picture on Aquacomputer’s site does it more justice anyway (see below). It’s a great looking block, and it was a relative breeze to install — but then I’ve had a bit of practice doing that. It’s also the only block I know of that has you use thermal compound on the memory chips instead of thermal pads. The backplate was also easy to install.

Yes, those are the Hawaiian islands cut into the copper cold plate — a nod to the GPU’s “Hawaii” codename. It’s a manufacturing detail that’s mostly wasted, given that in most builds that face of the block will be toward the bottom of the case and not in ready sight.

Anyway, once I got the card installed, it was a matter of doing the rest of the loop.

This wasn’t as easy as it sounds, merely because I kept overthinking some things.

First, noticing how the pump and reservoir were barely held down the way I had them mounted, I thought the better idea would be to have it straddle two UN Z2 brackets across the middle of the two 120mm fans on the bottom radiator. The only problem: the Z2 bracket on the closest fan — the one directly underneath the front radiator — interfered with the front radiator and meant I couldn’t mount it. So scratch that idea.

So I thought I needed to separate the pump and reservoir and started exploring that idea. That introduced quite a few complications into the equation, and it wasn’t until my back was screaming at me from working on this all day that I realized something I should have realized hours earlier: removing the stock cooler shortened the graphics card by up to an inch. This meant I could keep the pump and reservoir mounted to the bracket, and held a bit more securely to it.

So I tried that idea — and it worked! — and went to bed, after having my wife try to get as many kinks out of my back as possible. I saved finishing the tubing and getting it filled for a leak test till the next day.

Looking at that picture, you can probably see how it’s mounted at the edge of the bracket. It’s very similar to how I had it mounted in the initial Absinthe build, only on the opposite side of the bracket. It’s sturdy, held in place by two M4 bolts from underneath, and the tubing running from the CPU back to the top of the reservoir also helps keep it in place.

You can see off to the left of the pump the new run from the bottom radiator to the graphics card — just one giant 90-degree bent piece of copper tubing. And I went with another 45-degree bend to go from the graphics card to the top radiator.

The windowed lid on the block makes it very easy to tell when there are large bubbles still trapped in the block. I was tilting the case to all kinds of different extremes to get the largest bubbles out of the block as I wanted those gone before I powered on the system.

And initial performance numbers from the block are certainly quite pleasing — ambient temperature was about 20C:

It topped out at 45C after running three cycles through Unigine Heaven. The stock cooler would top out in the lower 70s. On cold boot, it idled in the lower 30s — not unexpected given the hotter profile for the card. I expect temperatures may improve as the last of the air works its way out of the blocks and radiators, but it won’t be a significant improvement.

On or off?

It’s a reasonable question to be asking: should you leave your computer on or turn it off every night? Recently Simon Hill of Digital Trends tackled this question, bringing in Steven Leslie from Geek Squad to assist.

The biggest argument for leaving your computer on constantly is that it is less damaging to your components in the long run. This is only partly true, and it’s based on what I’d consider a somewhat flawed observation. The time when a computer part is most likely to fail is when it is being powered on, not while it is running — don’t read that to mean a part won’t fail while it’s running, because it most certainly can; it’s just not as likely to, much as your car is more likely to fail to start than to die while running or idling.

All computer components have a rating called MTBF, or mean time between failures. “Mean” here means average. If a part is rated at 50,000 hours MTBF (typical for a computer hard drive), that means the manufacturer has determined the average lifespan of the component to be 50,000 hours of constant running — it should be able to run continuously powered for over 2083 days, or about 5 years and 8 to 9 months. Given this is an average, your part could die within a year of continuous running, or it could last 10 years.
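To make that arithmetic concrete, here’s the hours-to-years conversion as a quick sketch (Python, illustration only):

    mtbf_hours = 50_000                  # the example rating above
    days = mtbf_hours / 24               # ≈ 2083 days of continuous running
    years = days / 365.25                # ≈ 5.7 years, i.e. about 5 years, 8-9 months
    print(round(days), round(years, 1))  # 2083 5.7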

So the idea goes that powering off your computer every day should extend the lifespan of its components, because the MTBF is rated on constant use. The slight “surge” from powering on isn’t going to cause any significant stress to your components unless you’ve got bad wiring in your home or you’re connected to a substandard power grid. But if that were the case, and you’re not using an uninterruptible power supply (UPS, and I’m not talking about the courier), surge suppressor, or power conditioner, you’re already causing damage to your computer whenever it’s powered on.

Modern desktop power supplies can more robustly handle substandard power delivery from the wall, but there are still limits beyond which that is not true, and those limits are in part determined by what your computer requires to operate — i.e. if your computer is trying to draw 500W from the wall but is having a difficult time getting it due to the power delivery in your home, you’re going to have stability issues. To find out the quality of the power delivery on your grid, you’ll need to talk to your power company. If the grid is fine, you may need to upgrade the wiring in your home. That will have significant benefits beyond your computer: it could increase the lifespan of everything in your home, including major appliances, possibly improve energy efficiency, and decrease the possibility of a fire due to faulty electrical lines.

Now this doesn’t change the fact that a component is still most likely to die when powering on. That’s just the nature of any electrical component.

In his article, Hill writes this: “A traditional hard disk drive, for example, has moving parts, whereas a solid state drive doesn’t and is far more robust as a result.” Okay let’s tackle this idea.

An SSD, or solid state drive, does not have moving parts. As such, it consumes much less power, runs cooler and quieter, and is much less prone to shock damage (as in from a striking blow, not electrical shock). But SSDs are significantly more expensive than traditional platter hard drives (HDDs), and whether they are more robust depends on how you use them.

Here’s the caveat: you don’t want to use them in systems where there will be very frequent changes to the data stored on them. Content creators, software developers, and the like really should not use an SSD as primary storage. Using one to store the operating system and software is fine — it ensures programs load quickly, and that information is unlikely to change frequently.

But use a traditional HDD for storing work files and the like.

This is because SSD performance degrades over time, and it will degrade faster if you don’t keep much free space on the drive while using it as a primary drive, or if you’re making frequent writes, deletes, or rewrites to its data. You can mitigate this by having multiple SSDs configured in RAID 0 (beyond the scope of this article), thereby spreading writes and rewrites across multiple drives. But you’d likely still be better off with a platter drive, especially since platter drives are much less expensive, and capacity is going to be more important than speed — it doesn’t matter how fast your data is saved and loaded if you can’t store it.

The laptops issued by my employer have SSDs in them, and my upcoming device upgrade will also have an SSD. I’m actually considering talking to device support about moving over to a platter HDD, even if I have to pay for the drive out of my own pocket. This is in part due to capacity — see the previous paragraph. The drive I currently have is only 120GB, and the drive in the new laptop is 256GB; as a software engineer, I prefer capacity over speed, so I’ll be discussing putting a larger platter hard drive in its place. Word has it I may be able to have an HDD alongside the SSD, so I may explore that option as well.

SSDs are faster, but, again, their performance will degrade over time, and the SSD in my laptop is currently not much faster than a traditional platter drive. This is in part because my SSD — like those of my colleagues — has about 75% of its capacity currently used. That will cause an SSD’s performance to degrade faster, because the algorithms the SSD’s firmware uses to prevent that degradation can’t work nearly as well without free space.

Several of my colleagues have had to replace their SSDs over the last couple years due to drive failure. I think the fact that we also use full-drive encryption at work plays into that.

Whether you will encounter those limitations depends on what you do with your system, but do not ignore the fact that any drive, HDD or SSD, can fail on you at any time, so keep regular backups of important data. And the power delivery to and inside the system is what is most likely to cause a component to fail.

Hill provided a couple lists of reasons to leave your computer on or turn it off. The first reason to leave it on is if you’re using the system as a server, which most people aren’t, so it’s a moot point. The other two reasons are perfectly applicable:

  • There are background updates, virus scans, or other activities you’d like to occur while you’re away.
  • You never want to wait for it to start up.

It’s always best to have scans and updates occur during downtime, especially if the updates require a reboot — which means even if you’re not shutting down every night, at least have the system in a state where a reboot can automatically occur if it must. Speaking of which, a nightly reboot can still be beneficial even if you don’t shut down every night, and you can even schedule it, as shown below.
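On Windows, you can set that up with the built-in schtasks utility — a sketch, where the task name and the 3:30 AM start time are just examples, run from an elevated command prompt:

    schtasks /create /tn "Nightly reboot" /sc daily /st 03:30 /tr "shutdown /r /t 60"

The shutdown /r /t 60 portion restarts the machine after a 60-second delay, giving anything still running a moment to close.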

Now let’s talk about the reasons to turn it off:

  • Leaving it on wastes electricity and can slightly increase your power bill.
  • You don’t want to be disturbed by notifications or fan noise.
  • Computer performance generally benefits from an occasional reboot.

Of these, the last is kind of nonsensical as you can manually reboot your computer. You can do a reboot in the morning while making coffee, or at night before going to bed. If you don’t want to be disturbed by fan noise or notifications, then put your system in another room so it’s not going to disturb you. And if the fans in your system are loud, replacing them is always an option, as there are plenty of options available for quiet fans that still push a good amount of air.

Leaving your system on will obviously mean it’s using power. Your system does not draw a continuous level of power from the wall; it only draws what it needs, up to the rated capacity of the power supply. So if you’re leaving your system idling overnight, it’s increasing your power bill by only a few dollars a month, and shutting it down every night is probably not going to make a huge difference on your bill, especially if the amount of time it’s off will always be less than the amount of time it’s on.
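To put a rough number on that, here’s a sketch of the math — the 75W idle draw, 12 idle hours a day, and 12 cents per kWh are all assumptions, so plug in your own figures:

    idle_watts = 75                      # assumed idle draw; varies by system
    idle_hours_per_day = 12              # assumed overnight idle
    kwh_per_month = idle_watts * idle_hours_per_day * 30 / 1000   # 27 kWh
    cost_per_month = kwh_per_month * 0.12                         # ≈ $3.24 at $0.12/kWh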

How little power it draws when idling depends on the processor, mainboard, and other components. For example, Western Digital’s WD Green drive is designed to spin down when not in use — which makes it undesirable for anything other than backup storage. Modern processors are fairly power efficient, with built-in technologies that allow them to sip power when idling. The efficiency of the computer’s power supply will also determine how much power the system uses while idling.

Then there’s the question of sleep or hibernate. Just like powering off, hibernating will only be of significant benefit if the computer is going to be off more than it is on.

Putting your computer into a sleep state is also a beneficial option, but with one caveat: make sure your system is configured so the mouse cannot wake it up. If you have a wireless mouse, turn it off before putting your computer to sleep. A lot of optical mice on the market have very high-resolution tracking, meaning the slightest shift of the mouse could wake your computer unexpectedly, negating the benefit of putting it to sleep. Hard drives spin down when the system is put to sleep, so unexpected wake-ups caused by a jolt that makes the mouse report movement will spin them back up sooner, and potentially more frequently.
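On Windows, you can check which devices are currently allowed to wake the system and revoke that ability from the mouse — the device name below is just an example; use whatever name the first command reports on your machine:

    powercfg /devicequery wake_armed
    powercfg /devicedisablewake "HID-compliant mouse"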