Console gaming is hardly different from PC gaming, and much of what people say to put PC gaming above console gaming is often wrong.
I'm not sure about you, but for the past few years I've been hearing people go on and on about the PC's "superiority" over the console market. People cite various reasons why gaming on a PC is supposedly "objectively" better than console gaming, usually related to power, cost, ease of use, and freedom. Only problem: much of what they say is wrong. There are a lot of misconceptions thrown around in the PC vs. console debate that I believe need to be addressed. This isn't about "PC gamers being wrong" or "consoles being the best," absolutely not. I just want to cut through some of the stuff people use to put down console gaming and show that console gaming is incredibly similar to PC gaming. Yes, this is coming from someone who mainly games on console, but I'm also getting a new PC that I'll game on as well, not to mention the 30 PC games I already own and play, so I'm not particularly partial to one over the other. I'll mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say applies to Xbox as well. Just because I don't point out many specific Xbox examples doesn't mean they aren't out there.
“PCs can use TVs and monitors.”
This one isn't so much a misconception as it is the implication of one, and overall it's just… confusing. It shows up in some articles and in the pcmasterrace "why choose a PC" section, where they're practically implying that consoles can't do this. Yes, as long as your PC's ports match up with your screen's inputs, you can plug a PC into either… but you can do the same with a console, again, as long as the ports match up. I'm guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, while consoles are generally restricted to HDMI… but even so, monitors often have HDMI ports. In fact, PC Magazine just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. Even if the monitor/TV doesn't have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don't match your monitor/TV… use an adapter. I don't know what the point of this argument is, but it's made a worrying number of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? They've been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori's got you covered. And of course, if keyboard and mouse is what keeps you on PC, there's a PlayStation-compatible solution for that. Want to use the keyboard and mouse you already own? Where there's a will, there's a way. These aren't isolated examples, either; there are plenty of options for each of these kinds of controllers. You don't have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part 1: the Software

This is one that I find… genuinely surprising. There have been a few times I've mentioned that part of the reason I chose a PS4 is budget gaming, only to be told that "games are cheaper on Steam." To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about discs. Take Dirt Rally, a hardcore racing sim that's… still $60 on all 3 platforms digitally… even though its successor is out.
See my point? Oftentimes the game is cheaper on console because of the disc alternative that's available for practically every console game, even when the game is brand new. Dirt 4 - remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc at a discounted price. And again, this is for a game that came out 2 months ago, while its predecessor's digital cost is locked at $60. Of course, I'm not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount.

Part 2: the Subscription

Now… let's not ignore the elephant in the room: PS Plus and Xbox Live Gold. These would be ignorable if they weren't required for online play (on the PlayStation side it's only required on PS4, but still). So yes, it's still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here's the thing: even though you have to factor this $60-a-year cost into your console, you can make it balance out at worst, and make it work for you as a budget gamer at best. As nice as it would be to skip the price entirely, it's not a problem if you use it correctly. Imagine going to a new restaurant. This restaurant has some meals that you can't get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make it a full meal, you have to pay an annual fee. Sounds shitty, right? But here's the thing: not only does this membership allow you to have sides with your meal, it also lets you eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let's look at PS Plus for a minute. For $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
So yes, you're paying extra because of that membership, but what you get with the deal pays for it and then some. Let's ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only plus 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that's still 24 free games a year. Sure, some months you might get games you don't like; then just wait until next month. In fact, look at Just Cause 3. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it's, again, a $60 digital game. That means with this one download, you've balanced out your $60 annual fee. Meaning? Every free game after that is money saved, and every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 that only add icing to that budget cake. (Or, if you prefer, just count games toward paying off PS Plus until you hit $60 in savings.) All in all, PS Plus, and Xbox Live Gold which offers similar benefits, saves you money. On top of that, you don't need these memberships to get discounts, but with them you get more discounts. Now, I've seen a few Steam games go up for free for a week, but what about being free for an entire month? And even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc discounts? It would take a lot of research and math to see whether every console gamer would save money compared to every Steam gamer for the same games, but at the very least, the costs balance out.

Part 3: the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that's $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren't short. The 6th generation, from the launch of the PS2 to the launch of the next generation of consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn't discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen has lasted 4 years so far. That's 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, that would be over $1,600 total.

And let's be fair here: just like you could upgrade your PC hardware whenever you wanted, you didn't have to get a console at launch. Look at PlayStation again. In 2002, only two years after its release, the PS2's retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100 - $200 lower than the launch price. The PS4? You could've gotten either the Uncharted bundle for $350 or one of the PS4 Slim bundles for $250. This brings it down to $750 - $850, which, again, is spread over a decade and a half. That isn't even counting used consoles, sales, or the further price cuts I didn't mention.

Even if that still sounds like a lot of money to you, even if you're laughing at the thought of buying new systems every several years because your PC "is never obsolete," tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives? It adds up. You don't need to replace your entire system to spend a lot of money on hardware. Even if you weren't upgrading for the sake of upgrading, I'd be amazed if the hardware you've been pushing with gaming lasted even a third of that 17-year period. Computer parts aren't designed to last forever, and they really won't when you're pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6 - 8 years, if you've got the high-end stuff.

But let's assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it's still going strong. Problem is, you will have to upgrade something eventually. Even if you've managed to get this far into the gaming realm with the same 17-year-old hardware, I'm betting you didn't do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? And don't think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that's only half of its lifetime so far: you can't get it for free now, and haven't been able to for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say they didn't need to buy a new system every so often for the sake of gaming, that doesn't mean they haven't been paying for hardware, and even if they've only gotten into PC gaming recently, they'll be spending money on hardware soon enough.
“PC is leading the VR—“
Let me stop you right there. If you added together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? For a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets themselves are $500 - $600, when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say the Vive and Rift are more refined, a full PSVR set, system and all, can cost just over $100 more than a Vive headset alone. If anything, PC isn't leading the VR gaming market, the PS4 is. It's the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP "PS5"…), it won't be long until it can run the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR. …Though to be fair, if we're talking about VR in general, none of these headsets hold a candle to, surprisingly, Gear VR.
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that consoles are so "low spec" that when a developer has to take them into account, they can't design the game to be nearly as good as it otherwise would be. I mean, have you ever seen the minimum specs for games on Steam? Take GTA V:
Actually, bump up all the memory requirements to 8 GB, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs just to launch the game. It's almost as if the devs didn't worry about console specs when making the PC version of the game, because that version isn't on console. Or maybe the consoles aren't holding the games back that much because they're not that weak. Just a hypothesis. But the devs are still ooobviously having to keep weak consoles in mind, right? They could make their games sooo much more demanding if they were PC-only, right? Right? No. Not even close. iRacing:
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC-only games. That's right, no consoles to hold them back; they don't have to worry about whether an Xbox One could handle it. Yet they don't require anything more than the multiplatform games do. Subnautica:
So what's the deal? If developers don't have to worry about console specs, why aren't they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they've spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs locked their games to only the strongest PCs, they'd be losing out on a pretty sizable chunk of their potential buyers. Saying "devs having to deal with consoles is holding gaming back" is like saying "racing teams having to deal with Ford is holding GT racing back." A: racing teams don't have to deal with Ford if they don't want to, which is probably why many of them don't. B: even though Ford doesn't make the fastest cars overall, they still manage to make cars that are great on their own; they don't need to be compared to anything else for that to be clear. I want to go back to that previous point though, developers having to deal with low-end PCs, because it's integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn't so much a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017), well over 50% of Steam gamers use a GPU that's less powerful than the PS4 Slim's? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, only about 20% of PC gamers own an Nvidia 10-series card (about 15% for the 1060, 1070, and 1080 combined). Now, to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But fewer than 50% of Steam gamers have as much RAM as a PS4 or Xbox One, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up. We could mention that even 1% of Steam accounts represents over 1 million accounts, but that doesn't mean much against the tens of millions of 8th gen consoles sold; looked at that way, sure, the number of Nvidia 10-series owners is over 20 million, but there are over 5 times more 8th gen consoles sold than that. Basically, even though PCs run on a spectrum, saying they're more powerful "on average" is simply wrong. They have the potential to be more powerful, but most of the time people aren't willing to pay the premium to reach those extra bits of performance. Now why is this important? Surely what matters are the people who paid the premium cost for premium parts, right? Because of the previous point: PCs don't have some ubiquitous quality advantage over the consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player got a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X. Sure, you could still argue that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can't match a $1,000 PC build. It's the same as paying more for car parts: in the end you get a better car. However, there's a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that as you pay more for PC parts, performance increases faster than price does. Problem: that's not how technology works. Paying twice as much doesn't get you twice the quality most of the time. For example, let's look at graphics cards, specifically the GeForce 10 series, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You only increase the price by about 27%, and you get an 11% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps in Battlefield 4, a 22% increase in frame rate, and a 54% increase in MHash/second in Bitcoin mining. The cost increase is worth it, for the most part. But let's get to the real meat of it: what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you expect that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let's take a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not that substantial, I'd say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and 1 GB less VRAM than the Ti. For [almost] doubling the price, the raw specs don't promise much. But surely raw specs don't tell the full story, right? Let's look at some real-world comparisons. Once again according to GPU Boss, there's a 138% increase in hashes/second for Bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs really don't tell the whole story! Here's another one, the 1060's big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable: another $50 for a decent jump in power and double the memory! But, as we've learned, we shouldn't trust the raw specs for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate I had to look at Tom's Hardware (sorry miners, GPU Boss didn't cover the MHash/sec spec either). What's the verdict? Well, pretty good, I'd say. With 97 FPS, a 79% increase over the 1050… wait. 97? That seems too low. I mean, the 3GB version got 99. Well, let's see what TechPowerUp has to say... 94.3 fps. A 74% increase. Huh. Alright, alright, maybe that was just a dud. We can gloss over that, I guess. OK, one more, but let's go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating point speed definitely has to mean something, and 4 times the VRAM? Sure, it's 5 times the price, but as we saw, raw specs don't always tell the full story. GPU Boss returns to give us the rundown: how do these cards compare in the real world? Well… a 222% (more than three-fold) increase in MHash speed, and a 218% increase in FPS for Battlefield 4. That's right, for 5 times the cost, you get about 3 times the performance. Truly, the raw specs don't tell the full story. Increase the cost by 27% and you increase the frame rate in our example game by 22%. Increase the cost by 83% and you increase the frame rate by 83%. Sounds good, but increase the cost by 129% and you only get a 79% increase in frame rate (performance now lags cost by 50 points). Increase it by 358% and you get a 218% increase in frame rate (a 140-point lag). That's not paying "more for much more power," that's a steep drop-off after the third cheapest option. In fact, did you know that you have to get to the 1060 (6 GB) before you can compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6 GB), you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let's look at a PS4 Slim…
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
A 128% increase in floating point speed and a 13% increase in clock speed, for a 25% difference in cost. Unfortunately there's no Battlefield 4 comparison to make, but in BF1 the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I'll leave it to this bloke. Not to mention that you can get the texture buffs in 4K. Just like with the lower-cost GPUs, you get a decent increase in performance relative to price here. It's even worse when you look at the CPU for a gaming PC: the more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different tiers of Intel CPUs. One thing to note is that the highest i7 option in that video (6700K) was almost always within 10 FPS (for a few games, 15 FPS) of a certain CPU on that list, across just about all of the games. …That CPU was the lowest i3 option (6100). The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or smaller difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-hungry parts of a build, which is why I focused on them (besides the fact that, along with RAM, they're the most important parts of a gaming PC). With both, this "pay more to get much more performance" idea is pretty much the inverse of the truth.
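To make the scaling easier to see, here is a small back-of-the-envelope script using the figures quoted above. The prices are the approximate street prices implied by this post's own percentages, and the GTX 1080 frame rate is back-calculated from the quoted 218% increase, so treat all of these as illustrative assumptions rather than benchmark data.

```python
# Back-of-the-envelope look at the cost-vs-performance scaling described above.
# Prices and frame rates are approximations derived from the figures quoted in
# this post, not authoritative benchmarks.
cards = [
    ("GTX 1050",        109,  54),   # baseline card
    ("GTX 1050 Ti",     139,  66),
    ("GTX 1060 (3 GB)", 199,  99),
    ("GTX 1060 (6 GB)", 250,  97),
    ("GTX 1080",        499, 172),   # fps back-calculated from the quoted 218% increase
]

base_price, base_fps = cards[0][1], cards[0][2]
for name, price, fps in cards:
    cost_up = (price / base_price - 1) * 100   # % price increase over the 1050
    fps_up = (fps / base_fps - 1) * 100        # % frame-rate increase over the 1050
    print(f"{name:16s}  +{cost_up:4.0f}% cost  +{fps_up:4.0f}% fps  "
          f"{fps / price:.2f} fps per dollar")
```

The last column is the whole point: frames per dollar fall steadily once you move past the budget and mid-range cards.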
“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”
Now, one thing you might've heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop on compared to PC… but this ignores a very basic idea we've already touched on: if the devs don't want to make a game compatible with a system, they don't have to. In fact, this is why Left 4 Dead and other Valve games aren't on PS3; they didn't want to work with its hardware, calling it "too complex." That didn't stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. It also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out on PS3 in the same year as Left 4 Dead (2008). Apparently, plenty of other dev teams didn't have much of a problem with the PS3's hardware, or at the very least, they got used to it soon enough. On top of that, when designing the 8th gen consoles, both Sony and Microsoft chose CPUs that were easier for developers to work with, partly because the consoles are used for apps beyond gaming. Using single-chip proprietary CPUs is also cheaper and more energy-efficient than buying pre-made CPUs and boards, which is a far better explanation for using them than some conspiracy about Sony and MS trying to make devs' lives harder. Now, console exclusives are apparently a point of contention: it's often said that exclusivity can cause developers to go bankrupt. However, exclusivity doesn't have to be a bad thing for the developer. For example, when Media Molecule pitched their game to a publisher (Sony, coincidentally), they didn't end up tied into something detrimental to them. Their initial funding lasted 6 months. From there, Sony offered additional funding in exchange for console exclusivity. That may sound concerning to some, but the game went on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the studio, but 1: that was in 2010, two years after LittleBigPlanet's release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing with Sony was one of the best things they could've done, in their own view. Does this sound like a company that has it out for developers? There are plenty of examples people use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double 2013's figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) combined. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to compare monthly numbers instead, Steam has about half of what the consoles do, at 67 million. Now, back to that 65 million total user figure for Steam: the best reference I could find for PlayStation was an article giving the number of registered PSN accounts in 2013, 150 million. Over a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn't double, it sextupled, increasing six-fold. Considering the PS4 is already at 2/3 of the PS3's sales even though it's currently 3 years younger than its predecessor, I'm sure this trend is at least generally consistent. For example, let's look at DOOM (2016), an awesome fast-paced shooter with graphics galore. Of course, as a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales of physical copies, so the PS4 and Xbox numbers, once digital sales are included, are even higher than 3 million. This isn't uncommon, by the way. Even for games where the PC sales are higher than either console individually, there are generally more console sales in total. But, to be fair, this isn't anything new. PC gamers have never dominated the market; the percentages have always been about this much. PC can end up being the largest single platform for a game, but consoles usually sell more copies in total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
This isn't to say there's anything wrong with PC gaming, and it isn't to exalt consoles. I'm not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are, overall, pretty similar, that there isn't much dividing them, and that there isn't anything wrong with being a console gamer. There isn't some chasm separating consoles and PCs; at the end of the day they're both computers that are (generally) designed for gaming. This is about unity as gamers, to show that there shouldn't be a massive divide just because of the system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we're here to game and should be able to have healthy interactions regardless of platform. I'm well aware that this isn't going to fix… much, but it needs to be said: there isn't a huge divide between PCs and consoles; they're far more similar than people think. Each has upsides and downsides the other doesn't. There's so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or that even though PC part prices go down over time, so do console prices, but I just wanted to cover the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them. I thank anyone who takes the time to read all of this, and especially anyone who doesn't take what I say out of context. I also want to note that, again, this isn't "anti-PC gamer." If it were up to me, everyone would be a hybrid gamer. Cheers.
This is a little long, and IDK if anyone will read this, but I feel like it needs to be said. When my friends ask me why I'm in crypto, it's that, along with the upside of the technology, I am very, very wary of the future of the dollar. I'll go into detail shortly, but first, here are some other entities/people that share this opinion.
Why is the dollar in trouble? It comes down to how dollars are produced and how they get into circulation, which happens through government borrowing and fractional reserve lending. For this post I am going to focus on government borrowing. A lot of people do not know how dollars are created; my parents don't, my friends don't, and I remember asking teachers in high school and they didn't know either. We are not on a gold standard. The dollar is backed by absolutely nothing; dollars get into circulation through the government borrowing them from the Federal Reserve (the Fed), along with fractional lending (also a farce). When the government borrows from the Fed, the Fed prints the dollars and gives them to the government. The money did not exist beforehand; it is created to give to the government. The government then owes this money back, plus interest. However, the government never pays back the original amount, only the interest, and each year it borrows more money. This would be like you or me getting a credit card, maxing it out, then getting another one, and repeating the process forever.

The debt ceiling is a joke. It is not a ceiling but a budget for how much deficit spending the government will incur during the current year. The whole act of the US not increasing it is a show. If we ever chose not to raise the debt ceiling, the US would default on its debt, which would be the end of days for financial markets. That is not an option; the US will never default on its debt, and thus it will continue borrowing forever. It's the only option. This has an inflationary effect on the dollar: the amount of interest the government pays on the debt each year will increase (assuming interest rates stay the same), along with the total national debt. WE ARE PAYING FOR THIS. By the government borrowing more money and putting that money into circulation, an inflationary effect occurs. More money will be created, and more and more money will be created every year, forever. The US currently has close to $20 trillion in debt, not including outstanding liabilities (government pensions, social security, etc.). This will increase forever, the amount the government needs to borrow will increase forever, inflating the dollar forever. It increases exponentially. This is unavoidable, and there is literally no way out of it. Note: the inflation rate we see reported is the commodities inflation rate, how much milk, bread, etc. increased in cost, not how much currency is in circulation.

As if this wasn't bad enough, there are talks about removing the debt "ceiling" altogether: https://www.washingtonpost.com/news/wonk/wp/2017/09/07/trump-schumer-agree-to-pursue-plan-to-repeal-the-debt-ceiling/?utm_term=.6bd331865207 What will happen when the government no longer has a budget and is even more free to spend at will? More dollars will be created, further diluting the value of the dollar. Note: the US government has "lost" $8.8 trillion over the past few years. They just don't know where it went; maybe a secret space program that wasn't accounted for properly to hide it, maybe some dude committed fraud, we don't know. http://www.cnn.com/2016/08/23/politics/us-army-audit-accounting-errors/index.html - $6.5 trillion in 2015. https://www.metabunk.org/debunked-rumsfeld-says-2-3-trillion-missing-from-the-pentagon.t165/ - $2.3 trillion in 2001.
Cryptocurrencies have a clear advantage over this system: they're not controlled by the government, not borrowed into existence with no hope of being paid back, they can move large sums of money within minutes, they're not subject to confiscation by anyone, and there is no central bank. Dimon says Bitcoin is a fraud, and CNBC just linked Bitcoin mining to funding NK's nuclear program. Also, ACH can now do same-day transfers (weird, right? Seems like they're doing this to compete with something). https://www.cnbc.com/2017/09/13/bitcoin-mining-a-new-way-for-north-korea-to-generate-funds-for-the-regime.html - It feels like the MSM propaganda machine is turning. https://www.cnbc.com/2017/09/13/bitcoin-mining-a-new-way-for-north-korea-to-generate-funds-for-the-regime.html - ACH details. The powers that be see what's coming, and what crypto's potential is. We spend our lives working for fiat currency that the government essentially gets for free. Our lives are controlled by this; we go to work to buy a house, a car, we're encouraged to take on debt for these things, and we unwillingly become bitches of this system, trading hours of our lives for fiat currency the government gets for free. This is bullshit, and it's time we wake up to this and take a stand! I'll leave this quote by Henry Ford as my closing comment: "It is well enough that people of the nation do not understand our banking and monetary system, for if they did, I believe there would be a revolution before tomorrow morning." Please look up more information on this on your own. This is something I feel everyone should know. LONG LIVE CRYPTO.
A Chairman at the World Economic Forum
https://www.weforum.org/people/glenn-h-hutchins/
https://archive.is/kubAY
Glenn Hutchins is chairman of North Island and a co-founder of Silver Lake, the global leader in technology investing. He is a director of both AT&T and NASDAQ OMX; a director of the Federal Reserve Bank of New York; vice chairman of both the Brookings Institution and the Economic Club of New York; and a member of the Executive Committee of the New York Presbyterian Hospital. He is an owner and member of the Executive Committee of the Boston Celtics basketball team. Mr. Hutchins is a director of the Harvard Management Company, which is responsible for the Harvard University endowment, and co-chairman of the University's capital campaign. He is also a board member of the Center for American Progress as well as a Fellow of the American Academy of Arts and Sciences. Previously, Mr. Hutchins served President Clinton in both the transition and the White House as a special advisor on economic and health-care policy. He was also previously chairman of the board of SunGard Data Systems, Inc. and Instinet, Inc. Mr. Hutchins and his wife, Debbie, founded the Hutchins Family Foundation which, among other projects, has created the Hutchins Center for African and African-American Research at Harvard University, which is chaired by Mr. Hutchins; the Hutchins Center on Fiscal and Monetary Policy at The Brookings Institution; and the Chronic Fatigue Initiative, which conducts basic research into the cause of chronic fatigue syndrome.
Advisory Board
Larry Summers
Born in New Haven, Connecticut, Summers became a professor of economics at Harvard University in 1983. He left Harvard in 1991, working as the Chief Economist at the World Bank from 1991 to 1993. In 1993, Summers was appointed Undersecretary for International Affairs of the United States Department of the Treasury under the Clinton Administration. In 1995, he was promoted to Deputy Secretary of the Treasury under his long-time political mentor Robert Rubin. In 1999, he succeeded Rubin as Secretary of the Treasury. While working for the Clinton administration Summers played a leading role in the American response to the 1994 economic crisis in Mexico, the 1997 Asian financial crisis, and the Russian financial crisis. He was also influential in the American-advised privatization of the economies of the post-Soviet states, and in the deregulation of the U.S. financial system, including the repeal of the Glass-Steagall Act.
Following the end of Clinton's term, Summers served as the 27th President of Harvard University from 2001 to 2006. Summers resigned as Harvard's president in the wake of a no-confidence vote by Harvard faculty, which resulted in large part from Summers's conflict with Cornel West, financial conflict of interest questions regarding his relationship with Andrei Shleifer, and a 2005 speech in which he suggested that the under-representation of women in science and engineering could be due to a "different availability of aptitude at the high end," and less to patterns of discrimination and socialization.
After his departure from Harvard, Summers worked as a managing partner at the hedge fund D. E. Shaw & Co., and as a freelance speaker at other financial institutions, including Goldman Sachs, JPMorgan Chase, Citigroup, Merrill Lynch and Lehman Brothers. Summers rejoined public service during the Obama administration, serving as the Director of the White House United States National Economic Council for President Barack Obama from January 2009 until November 2010, where he emerged as a key economic decision-maker in the Obama administration's response to the Great Recession. After his departure from the NEC in December 2010, Summers has worked in the private sector and as a columnist in major newspapers. In mid-2013, his name was widely floated as the potential successor to Ben Bernanke as the Chairman of the Federal Reserve, though after pushback from the left, Obama eventually nominated Federal Reserve Vice-Chairwoman Janet Yellen for the position.
DCG of course is an investor in both Blockstream and BTCC. DCG's money comes from:
Bain Capital Group
New York Life
Novel TMT Ventures
Solon Mack Capital
The Whittemore Collection
HCM International Co
DCG also owns CoinDesk. BTCC and Bitfury are the only two large mining pools that are outspoken in their support of Bitcoin Core. The Bitfury Group Leadership to Present at Clinton Global Initiative (https://archive.is/MWKee) Full Video (Begins at 32:00): "The Bitfury Group is proud to be the world's leading full service Blockchain technology company, we are deeply honored to represent this innovation to an audience of extremely dedicated game-changers, and we look forward to highlighting our company's groundbreaking 'Blockchain for global good' work at such an important event," said Smith. "From the White House to the Blockchain, I know this technology has the power to deliver inclusion and opportunity to millions, if not billions, of people around the world and I am so grateful to work for a company focused on such a principled vision." Bitfury Lightning Implementation
ACINQ's US headquarters is in Vienna, Virginia, a small town of only 16,000. Why would a global financial firm choose to locate here?
-- Feeder community into Washington, D.C. Has an Orange Line metro stop.
-- Located in Fairfax County, VA.
-- The US Federal Government is the #2 largest employer
-- Booz Allen Hamilton (NSA front company) is the #6 largest employer
-- In fact, most of the top employers in Fairfax County are either the US Federal Government or companies that provide services to the Federal Government
-- The county is home to the headquarters of intelligence agencies such as the Central Intelligence Agency, National Geospatial-Intelligence Agency, and National Reconnaissance Office, as well as the National Counterterrorism Center and Office of the Director of National Intelligence.
Chairman: Avinash Vashistha
Former Chairman and CEO of Accenture in India
He has worked with numerous clients in Banking, Investment and Financial services - General Atlantic, Goldman Sachs, Warburg Pincus, JP Morgan Chase, Visa, Citi Ventures, Baird Capital, Norges Bank, UBS, AXA - and has advised the World Bank, IDB, ADB, USAID and other multilateral agencies over the last 20 years on country strategy and investments across Asia and Latin America.
From 1986-1993 he worked for Information Management Consultants (imc) Ltd as a Technical Consultant with various federal government agencies. McLean, Virginia
1993-2000 Technical Consultant for Freddie Mac, in McLean Virginia
From 2000-2007, President of InterPro Global in Maryland
From 2011-2012, Director of VibbleTV in Columbia, Maryland
From 2008-Present has been Executive Director at ACINQ and Managing Partner at Vine Management, both in Vienna, Virginia.
BitFury Enhances Its Advisory Board by Adding Former CFTC Chairman Dr. James Newsome and Renowned Global Thought Leader and President of the Institute for Liberty and Democracy Hernando de Soto (Businesswire)
Bitfury Board of Directors
Robert R. Dykes
Former CFO at Juniper Networks from 2005-2007, which had an NSA backdoor added to router software.
Greg Maxwell spent “several years at Mozilla”, leaving in August 2014
The other board members include two Bitfury founders and an investor.
Bitfury Advisory Board
James Newsome
Ex-chairman of CFTC
Dr. Newsome was nominated by President Clinton and confirmed by the Senate to be at first a Commissioner and later a Chairman of CFTC. As Chairman, Newsome guided the regulation of the nation’s futures markets. Additionally, Newsome led the CFTC’s regulatory implementation of the Commodity Futures Modernization Act of 2000 (CFMA). He also served as one of four members of the President’s Working Group for Financial Markets, along with the Secretary of the Treasury and the Chairmen of the Federal Reserve and the SEC. In 2004, Newsome assumed the role of President and Chief Executive Officer of the New York Mercantile Exchange (NYMEX) where he managed daily operations of the largest physical derivatives exchange in the world. Dr. Newsome is presently a founding partner of Delta Strategy Group, a full-service government affairs firm based in Washington, DC.
Hernando de Soto
Hernando de Soto heads the Institute for Liberty and Democracy, named by The Economist one of the two most important think tanks in the world. In the last 30 years, he and his colleagues at the ILD have been involved in designing and implementing legal reform programs to empower the poor in Africa, Asia, Latin America, the Middle East, and former Soviet nations by granting them access to the same property and business rights that the majority of people in developed countries have through the institutions and tools needed to exercise those rights and freedoms. Mr. de Soto also co-chaired with former US Secretary of State Madeleine Albright the Commission on Legal Empowerment of the Poor, and currently serves as honorary co-chair on various boards and organizations, including the World Justice Project. He is the author of “The Other Path: the Economic Answer to Terrorism”, and his seminal work “The Mystery of Capital: Why Capitalism Triumphs in the West and Fails Everywhere Else.”
Criticisms:
-- In his 'Planet of Slums', Mike Davis argues that de Soto, who Davis calls 'the global guru of neo-liberal populism', is essentially promoting what the statist left in South America and India has always promoted: individual land titling. Davis argues that titling is the incorporation into the formal economy of cities, which benefits wealthier squatters but is disastrous for poorer squatters, and especially for tenants who simply cannot afford incorporation into the fully commodified formal economy.
-- An article by Madeleine Bunting for The Guardian (UK) claimed that de Soto's suggestions would in some circumstances cause more harm than benefit, and referred to The Mystery of Capital as "an elaborate smokescreen" used to obscure the issue of the power of the globalized elite. She cited de Soto's employment history as evidence of his bias in favor of the powerful.
https://www.theguardian.com/business/2000/sep/11/imf.comment
http://www.slate.com/articles/news_and_politics/hey_wait_a_minute/2005/01/the_de_soto_delusion.html
Dr. Tomicah Tillemann is Director of the Bretton Woods II initiative. The initiative brings together a variety of long-term investors, with the goal of committing 1% of their assets to social impact investment and using investments as leverage to encourage global good governance. Tillemann served at the U.S. State Department in 2010 as the Senior Advisor on Civil Society and Emerging Democracies to Secretary Hillary Clinton and Secretary John Kerry. Tillemann came to the State Department as a speechwriter to Secretary Clinton in March 2009. Earlier, he worked for the Senate Foreign Relations Committee, where he was the principal policy advisor on Europe and Eurasia to Committee Chairmen, Senators Joe Biden and John Kerry. He also facilitated the work of the Senate's Subcommittee on European Affairs, then chaired by Senator Barack Obama. Tillemann received his B.A. magna cum laude from Yale University. He holds a Ph.D. with distinction from the School for Advanced International Studies at Johns Hopkins University (SAIS), where he also served as a graduate-level instructor in American foreign policy.
http://live.worldbank.org/node/8468
https://archive.is/raDHA
Secretary Clinton appointed Tomicah Tillemann, Ph.D. as the State Department’s Senior Advisor for Civil Society and Emerging Democracies in October 2010. He continues his service under Secretary Kerry.
Mr. Tillemann and his team operate like venture capitalists, identifying ideas that can strengthen new democracies and civil society, and then bring together the talent, technology and resources needed to translate promising concepts into successful diplomacy. He and his team have developed over 20 major initiatives on behalf of the President and Secretary of State.
Mr. Tillemann came to the State Department as a speechwriter to Secretary Clinton in March 2009 and collaborated with her on over 200 speeches. Earlier, he worked for the Senate Foreign Relations Committee, where he was the principal policy advisor on Europe and Eurasia to Committee Chairmen, Senators Joe Biden and John Kerry. He also facilitated the work of the Senate's Subcommittee on European Affairs, then chaired by Senator Barack Obama. Mr. Tillemann's other professional experience includes work with the White House Office of Media Affairs and five U.S. Senate and Congressional campaigns. He was a reporter with Reuters New Media and hosted a commercial radio program in Denver, Colorado.
http://m.state.gov/md160354.htm
https://www.newamerica.org/our-people/tomicah-tillemann/
https://archive.is/u2yF0
Director of "Bretton Woods II" initiative at New America Foundation
Bretton Woods was an international summit that led to the creation of the IMF and the IBRD, one of the five institutions of the World Bank Group.
Speaking to the Clinton Global Initiative
"Prior to working at Edelman, my career has included serving as Deputy White House Press Secretary and Special Assistant to President Obama, Director of Public Affairs for the Office of the Director of National Intelligence, Director of Communications for the Senate Commerce, Science and Transportation Committee and its then Chairman Senator Rockefeller, Traveling Press Director for Secretary Hillary Clinton's 2008 Presidential campaign, and Director of Communications for Secretary Madeleine K. Albright and her consulting firm, The Albright Group, LLC."
https://medium.com/@jamieelizabethsmith/why-i-believe-in-the-blockchain-b19bf2014fab
Don Tapscott, co-author of the book “Blockchain Revolution,” hosted the meeting with his son and co-author Alex Tapscott at his family’s summer compound in Lake of Bays, Ontario. The group included some of blockchain’s biggest backers, including people with ties to IBM and JPMorgan. They considered ways to improve the governance and oversight of the technology behind the digital currency bitcoin as a way to fuel the industry’s growth. They included Jim Zemlin, executive director of the Linux Foundation; Brian Behlendorf, executive director of the Hyperledger Project, a blockchain supporter group that includes International Business Machines Corp., Airbus Group SE and JPMorgan Chase & Co.; and Ana Lopes, board member of the World Wide Web Foundation. Participants with blockchain industry ties include former deputy White House press secretary Jamie Smith, now chief global communications officer of BitFury Group Ltd., and Joseph Lubin, founder of startup Consensus Systems.
Was the founding director of the MIT Digital Currency Initiative
- Left his 4-year post as White House Senior Advisor for Mobile and Data Innovation to go directly to the MIT DCI
Brian Forde has spent more than a decade at the nexus of technology, entrepreneurship, and public policy. He is currently the Director of Digital Currency at the MIT Media Lab, where he leads efforts to mainstream digital currencies like Bitcoin through research and incubation of high-impact applications of the emerging technology. Most recently he was the Senior Advisor for Mobile and Data Innovation at the White House, where he spearheaded efforts to leverage emerging technologies to address the President's most critical national priorities. Prior to his work at the White House, Brian founded one of the largest phone companies in Nicaragua after serving as a business and technology volunteer in the Peace Corps. In recognition of his work, Brian was named a Young Global Leader by the World Economic Forum and one of the ten most influential people in bitcoin and blockchain.
https://www.linkedin.com/in/brianforde
https://archive.is/WjEGU
Includes Accenture (see Avinash Vashistha), Allianz, Deloitte (Scaling Bitcoin platinum sponsor, Blockstream partner), Citigroup, Bain & Company (parent of Bain Capital, DCG investor), Dalian Wanda Group (working on blockchain technology), Ernst & Young (see Paul Brody), HSBC (Li Ka-shing, Blockstream investor, used to be Deputy Chairman of HSBC), IBM, KPMG International, Mastercard (DCG investor), PwC (Blockstream partner, also sponsor of Scaling Bitcoin)
Future of Financial Services Report [PDF]
The word "blockchain" is mentioned once in this document, on page 23 (http://i.imgur.com/1SxyneJ.png): "We have identified three major challenge areas related to innovation in financial services that will require multi-stakeholder collaboration to be addressed effectively. We are launching a project stream related to each area, with the goal of enabling tangible impact. ... Decentralised systems, such as the blockchain protocol, threaten to disintermediate almost every process in financial services"
Excerpt: BitFury - www.bitfury.com - The Bitfury Group develops and delivers software and hardware solutions for businesses, governments, organisations and individuals who want to securely move an asset across the Blockchain.
Bitcoin Core 0.10.0 released | Wladimir | Feb 16 2015
Wladimir on Feb 16 2015: Bitcoin Core version 0.10.0 is now available from: https://bitcoin.org/bin/0.10.0/ This is a new major version release, bringing both new features and bug fixes. Please report bugs using the issue tracker at github: https://github.com/bitcoin/bitcoin/issues The whole distribution is also available as a torrent: https://bitcoin.org/bin/0.10.0/bitcoin-0.10.0.torrent magnet:?xt=urn:btih:170c61fe09dafecfbb97cb4dccd32173383f4e68&dn=0.10.0&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.publicbt.com%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.ccc.de%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Fopen.demonii.com%3A1337&ws=https%3A%2F%2Fbitcoin.org%2Fbin%2F

Upgrading and downgrading

How to Upgrade: If you are running an older version, shut it down. Wait until it has completely shut down (which might take a few minutes for older versions), then run the installer (on Windows) or just copy over /Applications/Bitcoin-Qt (on Mac) or bitcoind/bitcoin-qt (on Linux).

Downgrading warning: Because release 0.10.0 makes use of headers-first synchronization and parallel block download (see further), the block files and databases are not backwards-compatible with older versions of Bitcoin Core or other software:
Blocks will be stored on disk out of order (in the order they are
received, really), which makes it incompatible with some tools or other programs. Reindexing using earlier versions will also not work anymore as a result of this.
The block index database will now hold headers for which no block is
stored on disk, which earlier versions won't support. If you want to be able to downgrade smoothly, make a backup of your entire data directory. Without this your node will need to start syncing (or importing from bootstrap.dat) anew afterwards. It is possible that the data from a completely synchronised 0.10 node may be usable in older versions as-is, but this is not supported and may break as soon as the older version attempts to reindex. This does not affect wallet forward or backward compatibility.

Notable changes

Faster synchronization: Bitcoin Core now uses 'headers-first synchronization'. This means that we first ask peers for block headers (a total of 27 megabytes, as of December 2014) and validate those. In a second stage, when the headers have been discovered, we download the blocks. However, as we already know about the whole chain in advance, the blocks can be downloaded in parallel from all available peers. In practice, this means a much faster and more robust synchronization. On recent hardware with a decent network link, it can be as little as 3 hours for an initial full synchronization. You may notice slower progress in the very first few minutes, when headers are still being fetched and verified, but it should gain speed afterwards. A few RPCs were added/updated as a result of this:
getblockchaininfo now returns the number of validated headers in addition to
the number of validated blocks.
getpeerinfo lists both the number of blocks and headers we know we have in
common with each peer. While synchronizing, the heights of the blocks that we have requested from peers (but haven't received yet) are also listed as 'inflight'.
A new RPC getchaintips lists all known branches of the block chain, including those we only have headers for.
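Since getblockchaininfo now reports validated headers alongside validated blocks, an external client can watch headers-first synchronization progress. Below is a minimal sketch using only Python's standard library; the URL and the rpcuser/rpcpassword values are placeholders to be replaced with your own bitcoin.conf settings.

```python
import base64
import json
import urllib.request

# Minimal JSON-RPC helper; the URL and credentials are placeholders.
def rpc_call(method, *params, url="http://127.0.0.1:8332/",
             user="rpcuser", password="rpcpassword"):
    body = json.dumps({"jsonrpc": "1.0", "id": "sync-example",
                       "method": method, "params": list(params)}).encode()
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": "application/json"})
    req.add_header("Authorization", "Basic " +
                   base64.b64encode(f"{user}:{password}".encode()).decode())
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]

info = rpc_call("getblockchaininfo")
print("validated headers:", info["headers"])
print("validated blocks: ", info["blocks"])
if info["headers"] > 0:
    # During headers-first sync, headers run ahead of downloaded blocks.
    print("block download progress: {:.1%}".format(info["blocks"] / info["headers"]))
```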
Transaction fee changes

This release automatically estimates how high a transaction fee (or how high a priority) transactions require to be confirmed quickly. The default settings will create transactions that confirm quickly; see the new 'txconfirmtarget' setting to control the tradeoff between fees and confirmation times. Fees are added by default unless the 'sendfreetransactions' setting is enabled. Prior releases used hard-coded fees (and priorities), and would sometimes create transactions that took a very long time to confirm. Statistics used to estimate fees and priorities are saved in the data directory in the fee_estimates.dat file just before program shutdown, and are read in at startup.

New command line options for transaction fee changes:

-txconfirmtarget=n : create transactions that have enough fees (or priority) so they are likely to begin confirmation within n blocks (default: 1). This setting is overridden by the -paytxfee option.
-sendfreetransactions : Send transactions as zero-fee transactions if possible (default: 0)

New RPC commands for fee estimation:
estimatefee nblocks : Returns approximate fee-per-1,000-bytes needed for
a transaction to begin confirmation within nblocks. Returns -1 if not enough transactions have been observed to compute a good estimate.
estimatepriority nblocks : Returns approximate priority needed for a zero-fee transaction to begin confirmation within nblocks. Returns -1 if not enough free transactions have been observed to compute a good estimate.
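As an illustration of the new fee-estimation RPCs, the sketch below asks a node what fee per 1,000 bytes (and what priority) is likely needed to confirm within a few different targets. The JSON-RPC helper, URL, and credentials are placeholders, as in the earlier synchronization sketch.

```python
import base64
import json
import urllib.request

# Same minimal JSON-RPC helper as in the synchronization sketch above;
# replace the URL and credentials with your own bitcoin.conf settings.
def rpc_call(method, *params, url="http://127.0.0.1:8332/",
             user="rpcuser", password="rpcpassword"):
    body = json.dumps({"jsonrpc": "1.0", "id": "fee-example",
                       "method": method, "params": list(params)}).encode()
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": "application/json"})
    req.add_header("Authorization", "Basic " +
                   base64.b64encode(f"{user}:{password}".encode()).decode())
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]

for target in (1, 3, 6):
    fee = rpc_call("estimatefee", target)
    priority = rpc_call("estimatepriority", target)
    # Both RPCs return -1 until the node has observed enough transactions.
    print(f"confirm within {target} block(s): fee/kB={fee}, priority={priority}")
```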
RPC access control changes

Subnet matching for the purpose of access control is now done by matching the binary network address, instead of with string wildcard matching. For the user this means that -rpcallowip takes a subnet specification, which can be:

a single IP address (e.g. 1.2.3.4 or fe80::0012:3456:789a:bcde)
a network/CIDR (e.g. 1.2.3.4/24 or fe80::0000/64)
a network/netmask (e.g. 1.2.3.4/255.255.255.0 or fe80::0012:3456:789a:bcde/ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff)
An arbitrary number of -rpcallowip arguments can be given. An incoming connection will be accepted if its origin address matches one of them. For example:

| 0.9.x and before           | 0.10.x                              |
|----------------------------|-------------------------------------|
| -rpcallowip=192.168.1.1    | -rpcallowip=192.168.1.1 (unchanged) |
| -rpcallowip=192.168.1.*    | -rpcallowip=192.168.1.0/24          |
| -rpcallowip=192.168.*      | -rpcallowip=192.168.0.0/16          |
| -rpcallowip=* (dangerous!) | -rpcallowip=::/0 (still dangerous!) |

Using wildcards will result in the rule being rejected with the following error in debug.log:
Error: Invalid -rpcallowip subnet specification: *. Valid are a single IP (e.g. 1.2.3.4), a network/netmask (e.g. 1.2.3.4/255.255.255.0) or a network/CIDR (e.g. 1.2.3.4/24).
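For readers wondering what "matching the binary network address" means in practice, the short sketch below uses Python's ipaddress module to illustrate the CIDR and netmask matching semantics. It only illustrates the idea; it is not Bitcoin Core's implementation, and the subnets shown are arbitrary examples.

```python
import ipaddress

# Subnets in the forms -rpcallowip now accepts: single IP, CIDR, and netmask.
allowed = [
    ipaddress.ip_network("192.168.1.1"),             # a single address
    ipaddress.ip_network("192.168.1.0/24"),          # network/CIDR
    ipaddress.ip_network("10.0.0.0/255.255.255.0"),  # network/netmask
]

def is_allowed(origin: str) -> bool:
    """Accept a connection if its origin address falls inside any allowed subnet."""
    addr = ipaddress.ip_address(origin)
    return any(addr in net for net in allowed)

print(is_allowed("192.168.1.77"))  # True: inside 192.168.1.0/24
print(is_allowed("192.168.2.1"))   # False: string wildcards like 192.168.* no longer apply
```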
REST interface

A new HTTP API is exposed when running with the -rest flag, which allows unauthenticated access to public node data. It is served on the same port as RPC, but does not need a password, and uses plain HTTP instead of JSON-RPC. Assuming a local RPC server running on port 8332, public data such as blocks and transactions can be requested directly by path; the path's extension (EXT) selects the encoding and can be bin (for raw binary data), hex (for hex-encoded binary) or json. For more details, including the exact endpoints, see the doc/REST-interface.md document in the repository.
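As a quick illustration, the sketch below fetches a block over the REST interface using Python's standard library. The /rest/block/<hash>.json path used here is an assumption for the example (check doc/REST-interface.md for the endpoints your node actually exposes), and the hash shown is the mainnet genesis block; the node must be started with -rest.

```python
import json
import urllib.request

# Assumed endpoint for this sketch: /rest/block/<block-hash>.json
GENESIS = "000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f"
url = f"http://127.0.0.1:8332/rest/block/{GENESIS}.json"

with urllib.request.urlopen(url) as resp:  # plain HTTP, no RPC password needed
    block = json.loads(resp.read())

print("height:", block["height"])
print("number of transactions:", len(block["tx"]))
```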
RPC Server "Warm-Up" Mode

The RPC server is started earlier now, before most of the expensive initialisations like loading the block index. It is available almost immediately after starting the process. However, until all initialisations are done, it always returns an immediate error with code -28 to all calls. This new behaviour can be useful for clients to know that a server is already started and will be available soon (for instance, so that they do not have to start it themselves).

Improved signing security

For 0.10 the security of signing against unusual attacks has been improved by making the signatures constant time and deterministic. This change is a result of switching signing to use libsecp256k1 instead of OpenSSL. Libsecp256k1 is a cryptographic library optimized for the curve Bitcoin uses, created by Bitcoin Core developer Pieter Wuille. There exist attacks against most ECC implementations where an attacker on shared virtual machine hardware could extract a private key if they could cause a target to sign using the same key hundreds of times. While using shared hosts and reusing keys are inadvisable for other reasons, it's a better practice to avoid the exposure. OpenSSL has code in their source repository for derandomization and reduction in timing leaks that we've eagerly wanted to use for a long time, but this functionality has still not made its way into a released version of OpenSSL. Libsecp256k1 achieves significantly stronger protection: as far as we're aware, this is the only deployed implementation of constant time signing for the curve Bitcoin uses, and we have reason to believe that libsecp256k1 is better tested and more thoroughly reviewed than the implementation in OpenSSL. (https://eprint.iacr.org/2014/161.pdf)

Watch-only wallet support

The wallet can now track transactions to and from wallets for which you know all addresses (or scripts), even without the private keys. This can be used to track payments without needing the private keys online on a possibly vulnerable system. In addition, it can help for (manual) construction of multisig transactions where you are only one of the signers. One new RPC, importaddress, is added which functions similarly to importprivkey, but instead takes an address or script (in hexadecimal) as argument. After using it, outputs credited to this address or script are considered to be received, and transactions consuming these outputs will be considered to be sent. The following RPCs have optional support for watch-only: getbalance, listreceivedbyaddress, listreceivedbyaccount, listtransactions, listaccounts, listsinceblock, gettransaction. See the RPC documentation for those methods for more information. Compared to using getrawtransaction, this mechanism does not require -txindex, scales better, integrates better with the wallet, and is compatible with future block chain pruning functionality. It does mean that all relevant addresses need to be added to the wallet before the payment, though.
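To make the watch-only flow concrete, here is a minimal sketch that imports an address without its private key and then lists payments involving it. The JSON-RPC helper, URL, credentials, and the example address (the genesis coinbase address, used purely as a stand-in) are placeholders, not part of the release notes.

```python
import base64
import json
import urllib.request

# Same minimal JSON-RPC helper as in the earlier sketches.
def rpc_call(method, *params, url="http://127.0.0.1:8332/",
             user="rpcuser", password="rpcpassword"):
    body = json.dumps({"jsonrpc": "1.0", "id": "watchonly-example",
                       "method": method, "params": list(params)}).encode()
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": "application/json"})
    req.add_header("Authorization", "Basic " +
                   base64.b64encode(f"{user}:{password}".encode()).decode())
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]

# Placeholder: substitute the address you actually want to watch. The label is
# "cold-storage"; the final argument asks the node to rescan the chain for past
# transactions, which can take a while on a fully synchronized node.
rpc_call("importaddress", "1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa", "cold-storage", True)

# Watch-only outputs now show up in the usual accounting RPCs when the
# includeWatchonly argument is set.
for entry in rpc_call("listtransactions", "cold-storage", 10, 0, True):
    print(entry["category"], entry["amount"], entry.get("involvesWatchonly", False))
```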
Consensus library

Starting from 0.10.0, the Bitcoin Core distribution includes a consensus library. The purpose of this library is to make the verification functionality that is critical to Bitcoin's consensus available to other applications, e.g. to language bindings such as [python-bitcoinlib](https://pypi.python.org/pypi/python-bitcoinlib) or alternative node implementations. This library is called libbitcoinconsensus.so (or .dll for Windows). Its interface is defined in the C header [bitcoinconsensus.h](https://github.com/bitcoin/bitcoin/blob/0.10/src/script/bitcoinconsensus.h). In its initial version the API includes two functions (a usage sketch follows the list):
bitcoinconsensus_verify_script verifies a script. It returns whether the indicated input of the provided serialized transaction
correctly spends the passed scriptPubKey under additional constraints indicated by flags
bitcoinconsensus_version returns the API version, currently at an experimental 0
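Below is a rough sketch of calling the library from Python via ctypes, in the spirit of the language bindings mentioned above. The argument and return types are my reading of bitcoinconsensus.h and should be checked against the actual header; the library must be built and installed for the CDLL call to succeed.

```python
import ctypes

# Load the consensus library shipped with Bitcoin Core 0.10
# (libbitcoinconsensus.so on Linux; use the .dll name on Windows).
lib = ctypes.CDLL("libbitcoinconsensus.so")

# bitcoinconsensus_version() takes no arguments and returns the API version
# (currently the experimental version 0).
lib.bitcoinconsensus_version.restype = ctypes.c_uint
print("libbitcoinconsensus API version:", lib.bitcoinconsensus_version())

# Declare the prototype of bitcoinconsensus_verify_script. The exact types are
# assumptions based on the header and should be verified before relying on them.
lib.bitcoinconsensus_verify_script.restype = ctypes.c_int
lib.bitcoinconsensus_verify_script.argtypes = [
    ctypes.c_char_p, ctypes.c_uint,   # scriptPubKey bytes and their length
    ctypes.c_char_p, ctypes.c_uint,   # serialized transaction bytes and their length
    ctypes.c_uint,                    # index of the transaction input to check
    ctypes.c_uint,                    # script verification flags
    ctypes.POINTER(ctypes.c_int),     # out-parameter receiving an error code
]

# To verify a spend you would pass the scriptPubKey being spent, the full
# serialized spending transaction, the input index, and the desired flags:
#
#   err = ctypes.c_int(0)
#   ok = lib.bitcoinconsensus_verify_script(script_pubkey, len(script_pubkey),
#                                           raw_tx, len(raw_tx), 0, 0,
#                                           ctypes.byref(err))
#
# A non-zero return value means the indicated input correctly spends the script.
```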
The functionality is planned to be extended to e.g. UTXO management in upcoming releases, but the interface for existing methods should remain stable.

Standard script rules relaxed for P2SH addresses

The IsStandard() rules have been almost completely removed for P2SH redemption scripts, allowing applications to make use of any valid script type, such as "n-of-m OR y", hash-locked oracle addresses, etc. While the Bitcoin protocol has always supported these types of script, actually using them on mainnet has been previously inconvenient as standard Bitcoin Core nodes wouldn't relay them to miners, nor would most miners include them in blocks they mined.

bitcoin-tx

It has been observed that many of the RPC functions offered by bitcoind are "pure functions", and operate independently of the bitcoind wallet. This included many of the RPC "raw transaction" API functions, such as createrawtransaction. bitcoin-tx is a newly introduced command line utility designed to enable easy manipulation of bitcoin transactions. A summary of its operation may be obtained via "bitcoin-tx --help". Transactions may be created or signed in a manner similar to the RPC raw tx API. Transactions may be updated, deleting inputs or outputs, or appending new inputs and outputs. Custom scripts may be easily composed using a simple text notation, borrowed from the bitcoin test suite. This tool may be used for experimenting with new transaction types, signing multi-party transactions, and many other uses. Long term, the goal is to deprecate and remove "pure function" RPC API calls, as those do not require a server round-trip to execute. Other utilities, "bitcoin-key" and "bitcoin-script", have been proposed, making key and script operations easily accessible via command line.

Mining and relay policy enhancements

Bitcoin Core's block templates are now for version 3 blocks only, and any mining software relying on its getblocktemplate must be updated in parallel to use libblkmaker either version 0.4.2 or any version from 0.5.1 onward. If you are solo mining, this will affect you the moment you upgrade Bitcoin Core, which must be done prior to BIP66 achieving its 951/1001 status. If you are mining with the stratum mining protocol: this does not affect you. If you are mining with the getblocktemplate protocol to a pool: this will affect you at the pool operator's discretion, which must be no later than BIP66 achieving its 951/1001 status. The prioritisetransaction RPC method has been added to enable miners to manipulate the priority of transactions on an individual basis. Bitcoin Core now supports BIP 22 long polling, so mining software can be notified immediately of new templates rather than having to poll periodically. Support for BIP 23 block proposals is now available in Bitcoin Core's getblocktemplate method. This enables miners to check the basic validity of their next block before expending work on it, reducing risks of accidental hardforks or mining invalid blocks.

Two new options to control mining policy:
-datacarrier=0/1 : Relay and mine "data carrier" (OP_RETURN) transactions
if this is 1.
-datacarriersize=n : Maximum size, in bytes, we consider acceptable for "data carrier" outputs.

The relay policy has changed to more properly implement the desired behavior of not relaying free (or very low fee) transactions unless they have a priority above the AllowFreeThreshold(), in which case they are relayed subject to the rate limiter.

BIP 66: strict DER encoding for signatures

Bitcoin Core 0.10 implements BIP 66, which introduces block version 3, and a new consensus rule, which prohibits non-DER signatures. Such transactions have been non-standard since Bitcoin v0.8.0 (released in February 2013), but were technically still permitted inside blocks. This change breaks the dependency on OpenSSL's signature parsing, and is required if implementations would want to remove all of OpenSSL from the consensus code. The same miner-voting mechanism as in BIP 34 is used: when 751 out of a sequence of 1001 blocks have version number 3 or higher, the new consensus rule becomes active for those blocks. When 951 out of a sequence of 1001 blocks have version number 3 or higher, it becomes mandatory for all blocks. Backward compatibility with current mining software is NOT provided, thus miners should read the first paragraph of "Mining and relay policy enhancements" above.
0.10.0 Change log

Detailed release notes follow. This overview includes changes that affect external behavior, not code moves, refactors or string updates.

RPC:

f923c07 Support IPv6 lookup in bitcoin-cli even when IPv6 only bound on localhost
b641c9c Fix addnode "onetry": Connect with OpenNetworkConnection