Buying a PC: Need advice!

I don’t think you know what the Xbox X’s internals are. It has 2 more cores on the CPU and a faster GPU that’s integrated into the CPU, so it has no latency from having to travel through the bus.

It also has 12 GB of DDR5 RAM; that rig has DDR4… which is half the speed.

I don’t think you know what is inside the XBOX 1X. It keeps the 8 CPU cores, just with a slight overclock. The Jaguar cores on the Xbox and PlayStation were AMD’s abortive entry into the mobile market, so they are not good.

And it has 12 GB of GDDR5, not DDR5. GDDR5 is almost 3x faster than DDR4, but it doesn’t do the same things. It has higher latencies, for one, and it takes in data in a different way. So even though you avoid bus latency (which is virtually non-existent in a PC anyway), you have both the CPU and the GPU making calls to the same memory, and the CPU will slow things down. That CPU is the reason you lack games at 1080p 60fps on both the Xbox and the PlayStation.

Finally, its 40 Polaris CUs are clocked slightly below the RX 580’s 36 CUs, so both end up with roughly the same TFLOPS (rough numbers below). That is actually useful here, because we are talking about the same manufacturer and the same architecture, so we can see where the RX 580 stands in the market today. If we look at several benchmarks, it sits between a GTX 970 and a 980… like I have been saying all along. And if we consider the kind of overclock some 970s are able to achieve (seriously, those cards were ridiculously good overclockers. I should build a PC with one…), you can bet that a 970 will most definitely keep up in games like this one or The Witcher 3, where the CPU has to work a lot.
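Rough back-of-the-envelope numbers, in case anyone wants to check (the clocks are the commonly quoted figures, so treat them as approximate):

```python
# GCN FP32 throughput: CUs * 64 shaders/CU * 2 ops per clock (FMA) * clock
def gcn_tflops(cus, clock_ghz):
    shaders = cus * 64                   # GCN: 64 stream processors per CU
    return shaders * 2 * clock_ghz / 1000.0

xbox_one_x = gcn_tflops(40, 1.172)   # ~6.0 TFLOPS (quoted 1172 MHz clock)
rx_580     = gcn_tflops(36, 1.340)   # ~6.2 TFLOPS (reference boost clock)
print(f"XBOX 1X ~{xbox_one_x:.1f} TFLOPS vs RX 580 ~{rx_580:.1f} TFLOPS")
```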

Not to mention that “the rig” has 8 GB of dedicated GDDR5 which will only do graphical tasks, and the DDR4 will cover the CPU’s needs without any trouble, because it is better at it. It is what it is built for. If GDDR5 were the be-all and end-all, you can bet that most manufacturers would make GDDR5-compatible motherboards and most desktop enthusiasts (me included) would have builds with it instead of DDR4.

So, I will say again, nothing against consoles. But, there are no miracles. If you pay 500 dollars, you get 500 dollars worth of hardware. You can’t cheat around it.

GDDR5 memory is significantly faster than DDR4 when it comes to graphics and gaming applications. It has a wider memory bus and can handle reads and writes in the same clock cycle.

The Xbox X doesn’t multitask like a PC, so it doesn’t need DDR4 RAM; it has 4 GB dedicated for multitasking as it is. That PC is not significantly better than an X.

I meant locking as in locking the cards away from crypto miners, bud.

It is not as linear as that. Yes, the frequency is higher (for now), but the latency is a lot higher too. One is designed for extremely low-latency operations: when you are using the CPU to calculate what every citizen in Rattay is doing and it needs to constantly call their functions and check them, a low-latency memory like DDR4 is ideal. The other (GDDR5) is made for high bandwidth, so when you are loading all the textures and models of said villagers, the cores of the GPU are all fed by the data flow from the memory. Timings here are less crucial; what matters is volume of data.
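To put rough numbers on the bandwidth half of that trade-off (peak theoretical figures only; real throughput is lower, and latency, which is what the CPU cares about, isn’t captured here):

```python
# Peak theoretical bandwidth = bus width (bytes) * transfer rate
gddr5_bus_bits, gddr5_gbps = 384, 6.8       # XBOX 1X: 12 GB GDDR5, 384-bit bus
ddr4_channels, ddr4_mts    = 2, 2400        # typical dual-channel DDR4-2400

gddr5_bw = gddr5_bus_bits / 8 * gddr5_gbps           # ~326 GB/s
ddr4_bw  = ddr4_channels * 64 / 8 * ddr4_mts / 1000  # ~38.4 GB/s

print(f"GDDR5 (console): ~{gddr5_bw:.0f} GB/s")
print(f"DDR4-2400 (dual channel): ~{ddr4_bw:.1f} GB/s")
```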

And that PC is far ahead of an XBOX 1X. I will repeat again the no miracle mantra:
You spend $500, you get $500 performance.

@RealityKings69 Ahhhh. Gotcha. Well, they are a cash cow, for now. But what they are probably thinking is that this craze will eventually end (either by ROI going up or ASIC machine costs going down), and then you have gamers who went somewhere else to get their kit, and that is market share they lose. Now I am not saying they will do it, I just think it might be possible.

And if I start upgrading, it’ll pretty much cause a chain reaction. With a new chip and mobo, I’ll probably end up with a new ssd. And then I’ll probably add another stick of ram in the basket. Then before you know it, a new gpu would’ve found its way into the basket. It’s like beer. You’ll need only one but you’ll buy the 6 pack anyway lol


We can speculate about performance all day and compare it to everything possible; all I know is that it achieves native 4K on most games, which is in itself proof of what it can achieve.

Like I stated earlier, it only takes looking at the recommended specs for the games running at 4K on the X to realize that it’s not that bad after all.

KCD is clearly not a good example for performance; it’s badly optimized at the moment and barely runs on the average PC.

The Witcher 3 runs at native 4K on X. I’ll post a link to the requirements to achieve it, based on hardware performance and prices; it speaks more than a thousand words.

For 1440p and 4K on Ultra settings, you’re better off with dual GTX 1070s or one or two GTX 1080s.

You are wrong. Though the XBOX 1X reaches 4k in less demanding areas, like looking at the sky, it uses dynamic resolution to keep the 30 fps, and it uses a mix of medium and high settings. I have a 980 Ti and I played The Witcher 3 at 4k 60 fps with high and very high settings using a modified .ini; the game ran smooth as butter and looked gorgeous. I even went as far as reducing some of the particle effects so I could increase the draw distance to basically the entire map, and it was a glorious moment. Those specs you are showing are for 4k 60 fps ultra settings. There is nothing special about the XBOX 1X. It is the best console around, and if you want a 4k sofa experience, go for it. But you cannot compare it to a PC with a modern i5 and a 1070. That machine is way ahead in terms of capability. Heck, an i5 2500k is a better CPU than the one on the XBOX. By a very wide margin.

Nope, you don’t seem very educated on the matter, even though you seem to strongly believe the contrary …

The Witcher 3 is running native 4K, it’s not dynamic resolution.
You have a performance mode that will scale down the resolution to keep up with higher FPS, but that’s it.

Some are running with dynamic resolution, such as AC Origins; even Wolfenstein 2 has been updated, and you can now switch between 2 modes, native 4K or dynamic resolution. And those are a few among many others.

Here is a list, but it’s outdated, now the list has expanded exponentially (and some games like Wolfenstein 2 have been updated with a native 4K mode by now) :

And it uses a mix of medium and high settings

That is true … but I’ll take 4K@30fps at medium/high settings any day …

Heck an i5 2500k is a better CPU than the one on the XBOX

You are right … technically, but there is something you’re not understanding, or refusing to understand on purpose.

Like I explained earlier, everything inside the Xbox One X is custom made; every component is tailor-made to work like an orchestra. Hell, they even tuned the voltage using the Hovis method.

My point is that the architecture was optimized to the max for what it was engineered for; the parts fit together like a Swiss watch and cannot work without each other. It’s a closed environment with highly optimized parts.

You keep comparing off the shelf parts to a completely custom made architecture that has no equal on the market, that is where I cannot agree.

If you take the custom APU and separate the CPU/GPU inside the Xbox One X and put it out of context, yes it’s going to perform poorly compared to any i5 on the market, but that’s not the point.

It’s an APU in the first place, so the CPU cannot work without the GPU; they go in pairs. On top of that, you keep pointing out that it’s the same Jaguar 8 cores… which is wrong. It’s a custom CPU based on the Jaguar architecture; nowhere has Microsoft declared it’s a Jaguar CPU inside the Xbox One X, and even the official spec sheet says “custom CPU”.

Please do some research first.

Second,

Those specs you are showing are for 4k 60 fps ultra settings.

Read it all, not only the parts you deem interesting for your narrative …
Look at the sheet underneath and the tags “playable” and what they mean.

Now try to locate where the Xbox One X fits in there, you’re a smart man … I’m sure you’ll figure it out, or you might have figured it out already but will refuse to admit it.

I hope you will be reasonable, but I don’t expect much … you seem to be very elitist, and for some reason trying hard to downplay the X.

I am very well educated on the matter. The Witcher 3 runs at 4K using dynamic resolution (from 1800p to 2160p) with a mix of medium to high settings. It is very impressive, mind you, but a 1070 will do it without dynamic resolution and with higher settings.

And I do prefer 4K though 60 fps is a must for me.

Like I explained earlier, there are no miracles. As for tailor-made, I wouldn’t go that far. It is a Jaguar CPU with a Polaris GPU, and these components were chosen for cost reasons; the reason they didn’t upgrade the 8-core from the XBOX 1 is that, because it is a closed architecture, they can’t. They are the same Jaguar cores, with an overclock. No matter what Microsoft’s marketing spiel says, it is what it is.

From the website you provided:

These benchmarks assume that all of the graphical settings in the game are set to their highest or turned on. You can get even better performance out of your PC if you adjust some of the graphical settings, but we explain that in more detail below.

So the table they provide has ludicrous things like 4K with 8x MSAA, which of course brings most systems to their knees. As I told you, I had my own copy of The Witcher running at 4K 60 fps at one point: 2xFXAA, Hairworks off, a mix of high/ultra on the rest according to my preferences, modded draw distance (where the 1X miserably fails because it is CPU bound), and getting 60 fps for the most part. Then, because I have a 1440p screen, I decided to stay at my native resolution, because 4K doesn’t scale well down to it. I have a 980 Ti.

I am an elitist? I said time and again, if you want an out-of-the-box 4K sofa experience, go for it. But there are no miracles. You get what you pay for (wow… such elitism… much angst). You can only do so much, and a 1070 is way better than an XBOX 1X, and a computer with one will run The Witcher 3 with the same settings as the X at 4K with 60 FPS. There you go, an apples-to-apples comparison.

I am very well educated on the matter. The Witcher 3 runs at 4K using dynamic resolution (from 1800p to 2160p) with a mix of medium to high settings. It is very impressive, mind you, but a 1070 will do it without dynamic resolution and with higher settings.

Please provide source if you are going to dismiss my points.

The link I submitted has a list of all the native 4K games and a list of titles also using a form of checkerboarding or dynamic scaling.

And I stand by it: The Witcher 3 runs at native 4K (3840x2160) 100% of the time on X, and it’s not using dynamic resolution …

There is a native 4K mode and a “performance mode” that will use dynamic scaling, but the game runs at native 4K 100% of the time if you use the native 4K mode.

Like I explained earlier, there are no miracles.

It’s not a miracle, I’m trying to explain a basic concept to you.

If you cannot understand the simple way, let’s do it the hard way then: provide me with a full PC spec list that can run all these games in native 4K (and again, NOT DYNAMIC SCALING, but true native 4K 100% of the time) for under $500.

Let’s make things clear here:

1. Don’t show up with second-hand parts … that would be embarrassing.
2. I’m talking about native 4K and playable out of the box … not tweaking, overclocking and whatnot …
3. The X just needs to be plugged into your TV and offers native 4K on high/medium out of the box. Don’t show up with a CPU and GPU combo; alone, it cannot play video games and doesn’t make sense (case + keyboard/mouse + PSU, etc…)

I am an elitist?

This is the second topic on this very forum where you downplay anyone bringing up the Xbox One X, even laughing at people for suggesting it.

You are stubborn and very poorly educated on the machine’s capabilities, you are often caught using misinformation to dismiss the system or downplay it … interpret that as you will, but yes, I honestly believe you are a PC elitist.

You also have a bad habit of using your own logic as some sort of proof, dismissing other people’s arguments but not bringing any tangible backing to your own.

So please, if you are going to reply to this with more biased statements, provide benchmarks with settings/specs and official statements for the arguments you bring to the table.

As of now, I’m waiting for an official spec sheet where it is specifically stated that the Xbox One X features a Jaguar CPU, and for proof that The Witcher 3 is using dynamic scaling on X and is not running at native 4K.

I’m waiting.

In Xbox One X’s case we’re getting a huge boost in pixels rendered with this mode, and even though it’s capped at a familiar 30fps ceiling, the engine is capable of hitting 3840x2160 in less taxing areas. Look to the sky, wander around interiors, or explore less GPU-intensive environments, and the game has never looked sharper. However, the catch is that resolution drops in more intensive areas. As an example, galloping around the notorious Crookback Bog - an area renowned for its impact on performance - sees the image resolve to a lower 3200x1800.

This is roughly a 30% reduction in pixel output in order to keep 30fps (quick math below).
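The quick math:

```python
# Pixel-count drop when the X falls back from 2160p to 1800p in heavy scenes
native  = 3840 * 2160   # 8,294,400 pixels
reduced = 3200 * 1800   # 5,760,000 pixels
print(f"Pixel count drop: {1 - reduced / native:.0%}")   # ~31%
```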

Besides that not being the point of this topic at all (the point being that the OP wanted opinions on a PC he is thinking of buying), you want me to provide a spec list where you impose severe limitations on the choice of hardware I can bring. At the moment, it is impossible to find a good solution since GPU prices are super inflated thanks to all the miners. Plus, the tweaking and overclocking is free performance. It is possible to do 4K without it, though, and I would assemble something like a Ryzen or an i3 with an RX 480 or a 580, depending on prices. If I budget around $200 for the GPU (the original price point of the RX 480/580), I have around $300 left for the rest of the build, which is possible (rough split below).
But at this point in time, new GPUs are too expensive.
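Purely as an illustration of how that split could look, with placeholder MSRP-era prices (not real quotes, and not a parts list I’m endorsing):

```python
# Hypothetical $500 budget split; every number here is a placeholder
build = {
    "RX 480/580-class GPU (at MSRP)": 200,
    "Budget Ryzen / i3 CPU":          110,
    "Motherboard":                     60,
    "8 GB DDR4":                       45,
    "SSD or HDD":                      45,
    "PSU + case":                      40,
}
print(f"Total: ${sum(build.values())}")   # $500
```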

It is the second time a topic about PCs gets derailed with people suggesting the OP buys an XBOX 1 X. Further, I have used only the specs of the XBOX 1X and of the PC the OP is thinking of buying, and I say that yes, this PC is way better than an XBOX 1X. There is no comparison possible.

Here is one of the many websites listing the internals of all the consoles. Read for yourself or do you want me to do that too?

TL;DR: it says here, as in every other source on the planet, that the XBOX 1X has 8 overclocked Jaguar cores. Wait! They overclocked them!!! Does that mean that the XBOX 1X goes against your rules?

Now finally, I am going to go downstairs, grab a bottle of juice and play some FH3 on my XBOX 1S.

Git.

No, it is a custom-made 8-core CPU, as stated in all official spec sheets.

https://www.xbox.com/en-us/xbox-one-x

If you have other information, please provide link.

Concerning the GPU, here, for your education :

The Scorpio GPU inside the Xbox One X isn’t completely identical to a desktop Polaris GPU, however. It is still essentially using the same AMD Graphics Core Next (GCN) 4.0 technology despite being built on TSMC’s 16nm production process, as opposed to the 14nm one used inside the 400- and 500-series Radeon cards.
What it does have is more compute units than the top-end AMD RX 580 Polaris card. The RX 580 has 36 CUs, delivering 2,304 GCN cores, while the Scorpio has a full 40 CUs and therefore 2,560 GCN cores. That’s a good chunk more graphics processing silicon that we know AMD could have built into a functional graphics card themselves. So we know they could have built a Polaris GPU with 40 GCN cores using the 16nm process, so you’d have to believe AMD would have been able to at least match that using their 14nm lithography. And probably push it a little further to incorporate a couple more CUs for shizzles and gizzles.

The GPU inside the X is beefier than an RX 580, which is worth around $400 alone … make of that what you will.

This is roughly a 30% reduction in pixel output in order to keep 30fps.

Yep, using the performance mode … Do you even read what I write?
The performance mode is 1800p/2160p with higher framerates; that’s the point.

You want me to provide a spec list where you impose severe limitations

I’m making this fair, period.

A CPU and GPU alone as a gaming proposition doesn’t make sense and only appeals to PC elitists that have no arguments. You cannot play video games with a CPU and GPU alone, and most of the time these propositions are way more expensive than an Xbox One X.

It is the second time a topic about PCs gets derailed with people suggesting the OP buys an XBOX 1 X.

And? Then you derail it by using biased information and downplaying it.

TL;DR: it says here, as in every other source on the planet, that the XBOX 1X has 8 overclocked Jaguar cores. Wait! They overclocked them!!! Does that mean that the XBOX 1X goes against your rules?

So Gamespot knows the specs better than the builder themselves …

This is where your reasoning is flawed: you think everything is an off-the-shelf part, and you do not take into account custom silicon, provided you even know what that means … which is another debate.

You found one of the only sources on the Internet saying “Jaguar”; neither the official website nor the top tech websites mention it. Simply because it isn’t.

But I’m not surprised … I wasn’t expecting any less …

Edit : This is my last answer for the sake of this topic … even though it is already too late and it has been butchered. My apologies to OP …

I am not sure if you are just being rude for the sake of it or actually trolling me. The Eurogamer article I posted is talking about the 4K mode in that section, not the “performance mode”.

I will quote it again:

Let’s talk numbers. The 4K mode is ambitious when considering a standard Xbox One largely ran the game at 1600x900. In Xbox One X’s case we’re getting a huge boost in pixels rendered with this mode, and even though it’s capped at a familiar 30fps ceiling, the engine is capable of hitting 3840x2160 in less taxing areas. Look to the sky, wander around interiors, or explore less GPU-intensive environments, and the game has never looked sharper. However, the catch is that resolution drops in more intensive areas. As an example, galloping around the notorious Crookback Bog - an area renowned for its impact on performance - sees the image resolve to a lower 3200x1800.

It is literally the second paragraph. If you ask for sources you could at least bother to read them.

Custom can mean a lot of things. They customized it in order to get a higher clock speed, for instance. But it is a Jaguar 8-core.
Here’s an ANANDTECH article on it:
https://www.anandtech.com/show/11992/the-xbox-one-x-review/3

I quote:

The original Xbox One featured eight CPU cores based on the AMD Jaguar microarchitecture, and the Xbox One X keeps that completely intact. There’s still eight cores, and they are still based on a slightly upgraded version of Jaguar. Microsoft stated the CPU performance increased 31% over the original console, and they achieved that with a frequency bump from 1.75 GHz to 2.3 GHz.

I can post 10 other reviews all saying the same. Same Jaguar architecture. They achieved 31% better performance because the clocks are ~31% higher. Shocker…
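The arithmetic, straight from the figures in that quote:

```python
# Clock bump vs. claimed performance gain on the XBOX 1X CPU
base_ghz, boosted_ghz = 1.75, 2.3           # from the Anandtech quote above
clock_gain = boosted_ghz / base_ghz - 1     # ~0.314
print(f"Clock increase: {clock_gain:.0%}")  # ~31%, matching the quoted perf gain
```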

As I said in a previous post, the XBOX 1X has 40 CUs, albeit at a slightly lower clock than the RX 580. That gives both GPUs similar performance. If you calculate teraflops (which here is a valid comparison, as these are chips from the same maker using the same architecture) you can then see where the 580 fits in the current GPU lineup, and it is between a 970 and a 980. The price of the 580, as I said before, is grossly inflated. It had an MSRP at launch of $199 for the 4 GB version. It is due to all the miners that the price has doubled. It costs almost the same as a 1070 with a lot less performance.

I am not using misinformation in my posts, just accurate descriptions of the capabilities of each platform.
My objective was to inform the OP about his choice of computer and nothing else. I hope this discussion will at least prove useful to someone.

This is a great debate here.

Dude… the Xbox One X may be the most powerful console, but it does not compete with a mid- to high-range PC… Speaking of The Witcher 3 on the Xbox, I found this in the first article I saw about it…

"However, the title also uses a dynamic resolution scaler on Xbox One X to maintain the frame rate at 30 FPS in “4K Mode”. "