Sam & Max graphics


Comments

  • TorTor
    edited May 2010
    jmmontoro wrote: »
    I've never mentioned minimum requirements, I mentioned suggested requirements, which is almost always what's needed to run the game at full settings.
    I am not sure what games you play, but I have yet to find a single game where you can put the graphics on max with the recommended settings.
    Yes, that's what I was thinking as well. My impression is that minimum means that the game is able to run, and recommended means something like "playable with reasonable performance and reasonable graphics quality".

    I looked up the requirements for Season 3; the only graphics-related information I could find was this: 128MB DirectX 8.1-compliant video card (256MB rec.)

    The thing is, though, graphics memory doesn't matter nearly as much as the graphics processor when it comes to actual performance. It's unfortunate that Telltale doesn't mention a recommended GPU; I've seen a lot of other publishers omit that information as well.

    The graphics memory as part of the system requirements can be very misleading, because there are a lot of graphics cards with huge amounts of memory but a low-end GPU. Memory is cheap, the GPU is expensive. It's a way for graphics card manufacturers to make their low-end models look better than they actually are, and to distinguish themselves from competing brands. It's kind of an underhanded move in my opinion. As such, the amount of graphics memory is almost useless as a predictor of performance.

    To provide actual useful information, publishers need to list specific GPUs in their system requirements--at least one from each of Intel (if supported), ATI and Nvidia; preferably one desktop GPU and one laptop GPU from each manufacturer. A few publishers do this, but not nearly enough of them.
  • edited May 2010
    Actually, now that we're back on that, I can recall one game where the recommended requirements were full detail at 30 FPS.
    Majesty 2.

    When that was pointed out on the forums, though, most people went "WTF? That's a pretty weird interpretation of recommended".
  • edited May 2010
    I've always assumed "recommended" meant "to play it on the best settings". Seems kind of weird to me otherwise. I mean, it makes sense to have the lowest and the highest ones, but the lowest + some random point in the middle? Not really.
  • TorTor
    edited May 2010
    Actually, now that we're back on that, I can recall one game where the recommended requirements were full detail at 30 FPS.
    Majesty 2.
    Avistew wrote: »
    I've always assumed "recommended" meant "to play it on the best settings". Seems kind of weird to me otherwise. I mean, it makes sense to have the lowest and the highest ones, but the lowest + some random point in the middle? Not really.
    I agree that "best settings" would be most logical, and I'd like to see it work that way. I don't think that's the current state of affairs though.

    But that poses a new problem: What screen resolution are you running the game at? Resolution has a huge impact on performance, so if the publisher asserts that a certain hardware configuration can run the game at the highest settings, they'd also have to specify which resolution that's possible at. There is no "maximum" screen resolution.

    You'd also have to have some kind of quantifiable measure of performance. Hassat mentioned FPS, but your experience at a certain number of frames per second will depend on the type of game: e.g. a game with largely static screens like a P&C adventure game will work great at low-ish FPS, but a first-person shooter (where you wave the "camera" all over the place) needs much higher FPS to give the illusion of fluid motion. This also varies a bit from person to person, and other factors such as motion blur can change things.
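
    To put rough numbers on both points (figures I picked myself, purely as an illustration, nothing official): the pixel count grows fast with resolution, and the per-frame time budget shrinks fast as the FPS target goes up.

        // Back-of-the-envelope sketch: pixel counts for a few common resolutions
        // relative to 1024x768, plus the time budget per frame at a few FPS targets.
        #include <cstdio>

        int main() {
            const double base = 1024.0 * 768.0;
            struct { int w, h; } modes[] = {
                {1024, 768}, {1280, 800}, {1680, 1050}, {1920, 1200}, {2560, 1600}
            };
            for (auto m : modes)
                std::printf("%4dx%-4d -> %.1fx the pixels of 1024x768\n",
                            m.w, m.h, (m.w * m.h) / base);

            const double targets[] = {24.0, 30.0, 60.0};
            for (double fps : targets)
                std::printf("%3.0f FPS -> %4.1f ms per frame\n", fps, 1000.0 / fps);
            return 0;
        }

    So "highest settings" at 2560x1600 is roughly five times the fill work of "highest settings" at 1024x768, which is exactly why a recommended spec would also need to state a resolution.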
  • edited May 2010
    Tor wrote: »
    But that poses a new problem: What screen resolution are you running the game at? Resolution has a huge impact on performance, so if the publisher asserts that a certain hardware configuration can run the game at the highest settings, they'd also have to specify which resolution that's possible at. There is no "maximum" screen resolution.

    But wouldn't there be a maximum option to choose from in the game's settings?
  • TorTor
    edited May 2010
    Avistew wrote: »
    But wouldn't there be a maximum option to choose from in the game's settings?
    Depends on the game; some games certainly have a maximum resolution. That's bad design practice though--at least for 3D games, there's no technical reason why you would have to impose a maximum.

    Most 3D games will detect which resolutions your graphics card and monitor support, and present those as your options. Many decade-old 3D games will work just fine at modern screen resolutions that didn't exist at the time.
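
    Not saying this is what the Telltale engine does, but here's a rough, Windows-only sketch of how a game can ask the driver which display modes are available instead of hard-coding a list:

        // Illustration only: enumerate the display modes reported for the
        // current display device via the Win32 API. A game can filter this
        // list and present it as its resolution options.
        #include <windows.h>
        #include <cstdio>

        int main() {
            DEVMODE mode = {};
            mode.dmSize = sizeof(mode);
            // nullptr = the display device the thread is running on.
            for (DWORD i = 0; EnumDisplaySettings(nullptr, i, &mode); ++i) {
                std::printf("%lux%lu, %lu bpp, %lu Hz\n",
                            mode.dmPelsWidth, mode.dmPelsHeight,
                            mode.dmBitsPerPel, mode.dmDisplayFrequency);
            }
            return 0;
        }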

    Right now it's possible to play at 7680x4800 (using nine 2560x1600 monitors in a 3x3 configuration) if you've got serious money to burn. That capability would be pretty useless if past games were restricted to only the screen resolutions that were available at the time they were created. (Well, you could argue that it's still a useless feature, but that's beside the point :p)
  • edited May 2010
    Hey, when I started playing episode 301, it was slow as hell on level 4. But after the second playthrough, it could go up to level 6 smoothly. Anyone know why?
  • edited May 2010
    302 is apparently more optimized than 301, but the darn film filter still slows things down.
  • edited May 2010
    Randulf wrote: »
    302 is apparently more optimized than 301, but the darn film filter still slows things down.

    I seriously doubt it's the grain filter. It's a very simple post processing effect.
  • edited May 2010
    I seriously doubt it's the grain filter. It's a very simple post processing effect.
    I too thought it was a very simple effect that shouldn't take much GPU power, but really, whenever the projector is turned on, things start to skip.
  • VainamoinenVainamoinen Moderator
    edited May 2010
    Ben wrote: »
    Besides the graphical improvements, we have some new animation tech that allows for more expressive faces.

    The facial animation has REALLY improved. Skunkape and Momma Bosco are just brilliant with it. Unfortunately, Sam & Max themselves, with their complete lack of irises and eyelids, don't really offer great opportunities to use the new animation technology. Nonetheless, I love it!
  • edited May 2010
    Randulf wrote: »
    I too thought it was a very simple effect that shouldn't take much GPU power, but really, whenever the projector is turned on, things start to skip.

    I don't see how it's related to the grain filter - it's a full-screen post-processing effect, and should consume the same resources no matter what's on the screen.
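
    If it helps, this is roughly what a film-grain pass boils down to (a made-up CPU-side sketch just to show the idea; the real effect would run as a pixel shader on the GPU): one cheap noise value per pixel, so the cost scales with the resolution, not with what's in the scene.

        // Made-up illustration of a film-grain pass, written CPU-side for clarity.
        // The per-pixel work is the same idea a pixel shader would do: add a small
        // pseudo-random offset to each colour channel.
        #include <cstdint>
        #include <cstdlib>
        #include <vector>

        void apply_grain(std::vector<uint8_t>& rgba, int width, int height, int strength) {
            for (int i = 0; i < width * height; ++i) {
                // One noise value per pixel, applied to R, G and B alike.
                int noise = (std::rand() % (2 * strength + 1)) - strength;
                for (int c = 0; c < 3; ++c) {
                    int v = rgba[i * 4 + c] + noise;
                    rgba[i * 4 + c] = (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v));
                }
            }
        }

    The loop runs width x height times no matter what is drawn underneath, which is why the projector being on or off shouldn't change how expensive the filter itself is.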
  • edited May 2010
    Unfortunately, Sam & Max themselves, with their complete lack of irises and eyelids, don't really offer great opportunities to use the new animation technology. Nonetheless, I love it!

    I think you are underestimating their expressiveness. (OK, I have trouble with them and often have to imagine eyebrows on them, but still, they are as expressive as any character, you just have to use your imagination!)