
Microsoft and the Xbox One. Thoughts?

Started by May 21, 2013 08:36 PM
267 comments, last by Hodgman 11 years, 7 months ago

Could ESRAM + DDR3 outperform GDDR5 for a typical video game?

While I certainly use my 360 a lot more to watch TV episodes (Hulu Plus mostly; I don't have a cable subscription) and movies than I use it to play any game, I do think they could have spent a bit more time on the technology in games. In my honest opinion, the Xbox One game trailers didn't look much different than the game trailers we saw when the 360 was announced.

The technology in games is exactly the same as a mid-level PC. What else is there to say?


They named it 'Xbox One'? What?

Yeah we joked about that a lot in the studio today. Don't confuse the XBox One with the XBox 1.

720 was such a better number than One. They can still change it back.

Just like they did with the Nintendo Revolution!

But seriously, god knows how many marketing gurus they tapped to come up with that name and the subsequent campaign around it... they aren't going to change it now.

As for the release, the cable thing was bizarre to me since I don't have cable either, but I do use my 360 for watching videos at least as much as I game on it. Switching between a game and Netflix would be more interesting. As a note, it was nice to see an emphasis on speed in the UI, since the 360's Dashboard/Home screen system menu was incredibly laggy for a dedicated platform if you were playing a game.

-Mark the Artist

Digital Art and Technical Design
Developer Journal

I don't have a problem with Microsoft boasting about the entertainment features because, if you think about it, consoles have been multi-purpose at least since the Sega Saturn debuted. (You'll recall it could play music CDs.) I do, however, have a problem with Microsoft not showing a single live demo, which Sony did (even though they technically cheated on the Watch Dogs demo by running it on a PC). I know E3 is around the corner, but why waste our time?

One thing I liked is that Microsoft mentioned they're bundling the Kinect with the system, making it a standard device. I think it is foolish to release a new console these days without having speech recognition as a base feature. When Sony didn't say anything about this in their presentation, I thought they missed an opportunity. I think it's a practical feature.

Now, this is probably going to sound like a dumb question, but I'm going to ask it anyway because it's been bugging me:

If the Xbox One has an x86-based CPU, shouldn't it be possible to provide backwards compatibility with the games from the original Xbox?

Disregard the part about BC. I was given some bad information.

Of all the rumored names, I thought 'Infinity' was the strongest, frankly, and it carries more or less the same connotation as "One" in this (non-mathematical) context.

The name is lackluster, the industrial design is lackluster, hardware is good, but the platform will succeed on the back of its software stack and services.

My take on "One" is that they mean "one system to do it all." I'm going to disagree with you on "Infinity," because I think if you release a system with that name, it suggests there's nothing else you can release that will top it. I was hoping on "Gamma" myself, since it's the 3rd system and it sounds cool.

It's worth noting that the Xbox will be running three kernels on top of a hypervisor. That's what I'd be worried about for performance, not the raw hardware.

Actually, the hypervisor is one of the "three OSes" they mentioned, so it's two OSes on top of a thin hypervisor: one to run AAA games, and one to run the system software and apps, which Ars is also reporting will provide some resources to the games OS in some manner. Ars specifically mentions Kinect processing, which seems likely given that certain gestures are always recognized.

throw table_exception("(╯ °□°)╯ ︵ ┻━┻");

The new Kinect tech is what interests me.

http://gizmodo.com/kinect-2-full-video-walkthrough-the-xbox-sees-you-like-509155673

Technically very impressive, and the idea of allowing developers to use biometric feedback opens some intriguing possibilities (L4D style AI director that knows when you're scared?)
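Purely as a thought experiment (nothing to do with any real Kinect API; ReadEstimatedHeartRate below is a made-up stand-in for whatever the camera would actually report), a director like that could just be a feedback loop on a stress estimate:

#include <algorithm>

// Hypothetical biometric input -- illustrative only, not part of any SDK.
float ReadEstimatedHeartRate(); // beats per minute

struct AIDirector
{
    float intensity = 0.5f; // 0 = calm pacing, 1 = full horde

    void Update(float restingBpm)
    {
        float bpm    = ReadEstimatedHeartRate();
        float stress = (bpm - restingBpm) / restingBpm; // crude normalized stress

        // Back off when the player is already terrified, ramp up when they're bored.
        if (stress > 0.3f)      intensity -= 0.01f;
        else if (stress < 0.1f) intensity += 0.01f;

        intensity = std::clamp(intensity, 0.0f, 1.0f);
    }
};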

But mostly, I find the idea of a constantly connected camera that's always watching and listening to me really, really creepy. The Kinect is mandatory and, apparently, the console just won't work without it.

Now, my wardrobe is largely free of aluminium headwear, but this just feels way wrong to me. Even assuming we can trust Microsoft not to sell this information (hey advertisers! would you like to know whether your ads provoke an emotional response?), there's always the possibility of it being hacked.

That said, we already have phones, tablets and laptops with mics and cameras, so maybe I am just being paranoid.

Either way, I saw nothing that will tempt me from my ivory tower of PC gaming. I mostly play with KB+M, have an Xbox controller for when I don't, and I doubt there's enough money in the world to convince me to join the great unwashed on Xbox Live.

if you think programming is like sex, you probably haven't done much of either. - capn_midnight

But mostly, I find the idea of a constantly connected camera that's always watching and listening to me really, really creepy. The Kinect is mandatory and, apparently, the console just won't work without it.

And soon we'll have Google Glass. The future of always being watched and recorded is inevitable...

Could ESRAM + DDR3 outperform GDDR5 for a typical video game?

It could be better for certain kinds of problems, though GDDR5 can probably say the same for a different set of problems. Given the way that games are constructed today, and the balance of the kinds of problems involved, I'd say it's likely too close to give anyone an edge. The difference, if any, will be that on the PS4 what you see is what you get, whereas on the Xbox One you have the option of moving things between the two pools and seeing what works best. You might come out ahead, or you might come out behind, but you have to make some effort. That assumes, of course, that the ESRAM is software-managed and not just a massive, hardware-controlled cache. If it turns out to be a cache, then what you see is what you get, just like on Sony's system.

But I tend to think that the 32MB of ESRAM will mostly be consumed by various graphics buffers for much of its life, and as a scratchpad for intermediate results.

Mostly, though, given that the Xbox 360 was a unified GDDR3 system (it too had a 10MB framebuffer-only EDRAM), I think Microsoft went this route because they believe it will save them costs, especially over the lifetime of the console. DDR3 is cheap and available, and will be for many years to come; that's not so much the case with GDDR5 -- it's expensive now, and it's not going to get cheaper relative to DDR3. Since the ESRAM is on-chip, it'll carry less cost as Microsoft migrates to smaller silicon fabrication processes.

If it's been effective at reducing the manufacturing costs, then they can launch at a lower price point and drop prices sooner to help gain market share. Or just take home higher profit margins.
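Just to put rough numbers on it, here's a back-of-the-envelope bandwidth calculation in C++ using the figures being reported at the time; the bus widths and clocks are rumors and guesses, not confirmed specs:

#include <cstdio>

// Peak theoretical bandwidth in GB/s = (bus width in bits / 8) * transfer rate in GT/s.
// The inputs below are the mid-2013 rumored figures, not confirmed specs.
double peakBandwidthGBps(int busWidthBits, double gigaTransfersPerSec)
{
    return (busWidthBits / 8.0) * gigaTransfersPerSec;
}

int main()
{
    double xb1Ddr3  = peakBandwidthGBps(256, 2.133); // DDR3-2133 on a 256-bit bus -> ~68 GB/s
    double xb1Esram = peakBandwidthGBps(1024, 0.8);  // guessed 1024-bit @ 800MHz -> ~102 GB/s, but only 32MB of it
    double ps4Gddr5 = peakBandwidthGBps(256, 5.5);   // GDDR5 @ 5.5 GT/s on a 256-bit bus -> ~176 GB/s

    std::printf("XB1 DDR3:  ~%.0f GB/s\n", xb1Ddr3);
    std::printf("XB1 ESRAM: ~%.0f GB/s (32MB)\n", xb1Esram);
    std::printf("PS4 GDDR5: ~%.0f GB/s\n", ps4Gddr5);
}

The exact numbers don't matter much; the point is that the DDR3 pool alone is well behind GDDR5, so the One only closes the gap for whatever working set fits in that 32MB.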

Playstation 4 is taking the path of being the system that gamers will want, while Xbox One is taking the path of being the system that, they think, everyone will want.

throw table_exception("(╯ °□°)╯ ︵ ┻━┻");

Playstation 4 is taking the path of being the system that gamers will want, while Xbox One is taking the path of being the system that, they think, everyone will want.

I agree with this assessment, but I'm baffled at the idea that they'll find a market out there for a device like this. Maybe some families will think it's a good compromise machine. But the Xbox brand has been built on the loyalty of hardcore gamers and top-performing games for young men. It's too big a void to cross, in my opinion, and I think they're only attempting it because their poor hardware choices have forced their hand.

Could ESRAM + DDR3 outperform GDDR5 for a typical video game?

...on the XBox One you might have the option of moving things between the two and seeing what works best.

But I tend to think that the 32MB of ESRAM will mostly be consumed by various graphics buffers for much of its life, and as a scratchpad for intermediate results.

Regarding the embedded RAM, we won't know the impact it will have until we know how it's used. As Ravyne says, maybe it'll be addressable as usable RAM, or maybe it will have a special fixed function.

On the 360, the embedded RAM could only be written to by the ROPs (and it was the only place the ROPs could write to), and data could only be copied from EDRAM to main memory (GDDR3) in large chunks (graphical "resolve" operations). Or, in simpler terms: only render-targets could occupy EDRAM.

This is actually one of those quirks that developers have to deal with... When not in use, render-targets live in main RAM, but the only place you can draw to is EDRAM. So your engine has to hide the fact that when you bind a new render-target and want to draw over its existing contents, you first have to draw a full-screen quad textured with the render-target's previous contents (to copy that data back into EDRAM), and when you're finished, the engine has to copy the EDRAM values back into main RAM.
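In engine terms it ends up looking something like the sketch below. BindForDrawing, DrawFullscreenQuad and ResolveToMainRam are made-up names standing in for the platform calls, not the real 360 API; the point is just the extra copy work the engine has to hide:

struct RenderTarget
{
    void* mainRamCopy;      // where the contents live when the target isn't bound
    bool  preserveContents; // do we need the old pixels when we bind it again?
};

// Hypothetical platform calls -- illustrative names only, not the actual console API.
void BindForDrawing(RenderTarget& rt);
void DrawFullscreenQuad(const void* texture);
void ResolveToMainRam(RenderTarget& rt);

void BindRenderTarget(RenderTarget& rt)
{
    // The GPU can only render into EDRAM, so binding means claiming EDRAM space.
    BindForDrawing(rt);

    if (rt.preserveContents)
    {
        // "Drawing over existing contents" first requires copying them back into
        // EDRAM by drawing a full-screen quad textured with the old main-RAM copy.
        DrawFullscreenQuad(rt.mainRamCopy);
    }
}

void UnbindRenderTarget(RenderTarget& rt)
{
    // When finished, the EDRAM contents are resolved (copied) back out to main RAM
    // so they can be sampled as a texture later.
    ResolveToMainRam(rt);
}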

This also placed a lot of restrictions on the resolution of the render-targets you could use, because they'd have to fit in EDRAM. An FP16 HDR buffer and a D24S8 buffer at 720p are ~10.5MiB, which doesn't fit... which means it's impossible to do FP16 HDR rendering (and depth buffering) at 720p on the 360 without resorting to rendering the screen in two passes and stitching them together.
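The arithmetic, just to show it (nothing console-specific here):

#include <cstdio>

// Footprint of a 720p FP16 HDR colour target plus a D24S8 depth/stencil target.
int main()
{
    const int w = 1280, h = 720;
    const int bytesFp16Rgba = 8; // 4 channels * 16 bits each
    const int bytesD24S8    = 4; // 24-bit depth + 8-bit stencil

    double colourMiB = double(w) * h * bytesFp16Rgba / (1024.0 * 1024.0); // ~7.0 MiB
    double depthMiB  = double(w) * h * bytesD24S8    / (1024.0 * 1024.0); // ~3.5 MiB

    std::printf("colour %.1f MiB + depth %.1f MiB = %.1f MiB, vs 10 MiB of EDRAM\n",
                colourMiB, depthMiB, colourMiB + depthMiB);
}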

My guess would be that the One's embedded RAM will be used for a similar fixed purpose, and won't be freely addressable RAM that the developer can use however they like.

Sony mentioned that they decided against using embedded RAM for this exact reason -- that although it has some benefits, it's a huge quirk that developers have to deal with (and the PS3 was quirk-central, so they've got some making up to do!).

Regarding DDR3 vs GDDR5: we have to wait and see what the cache miss times are like for each console specifically before we know for sure.

It will probably also depend on whether you're CPU-bound or GPU-bound.

P.S. Does anyone else have the new consoles in their offices yet? I could go run some benchmarks, but I really don't have time, and I wouldn't be able to share the results anyway.
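If anyone does get to run something, the standard trick for measuring raw memory latency is a pointer-chase over a buffer much bigger than the caches, so every load depends on the previous one. A minimal sketch in portable C++ (nothing console-specific):

#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main()
{
    // ~64MB of indices -- far larger than any cache, so most hops should miss.
    const size_t count = 64 * 1024 * 1024 / sizeof(size_t);
    std::vector<size_t> next(count);
    std::iota(next.begin(), next.end(), size_t(0));

    // Sattolo's algorithm: build a single random cycle so the chase visits
    // every element before repeating (a plain shuffle can create short cycles).
    std::mt19937_64 rng(1234);
    for (size_t i = count - 1; i > 0; --i)
    {
        size_t j = std::uniform_int_distribution<size_t>(0, i - 1)(rng);
        std::swap(next[i], next[j]);
    }

    const size_t hops = 10000000;
    size_t idx = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t i = 0; i < hops; ++i)
        idx = next[idx]; // each load depends on the previous one
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
    std::printf("~%.1f ns per dependent load (idx=%zu)\n", ns, idx);
}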

This topic is closed to new replies.
