Our graphics
capabilities have plateaued lately. Games and movies look better than they ever
have, but we're still embroiled in an eternal battle with the Uncanny Valley.
No matter how great a video game heroine's hair looks, or how many individual
strands of fur are articulated on an anthropomorphic Pixar character, we can
still instantly tell that computer-generated graphics are just that: generated
by a computer. Aside from human eyes and mouths, one of the biggest culprits
behind the Uncanny Valley is computer-generated water. With PhysX's position
based fluids, though, CG water flows more convincingly than we've ever seen.
Perhaps the closest
we've come to mind-blowing computer-generated water effects in a consumer
product is the water in the Uncharted series. The way the water flows won't
explode any brains, but the way it dampens anything it touches, exactly where
it touches, is nothing short of mesmerizing. For instance, no matter what angle
a character enters a body of water from, his or her shirt and pants become damp
based on which parts of the clothing actually touched the water. While that
realism is impressive, it doesn't extend to the way the water flows. PhysX's
position based fluids research, however, seems to have produced
computer-generated water that moves just like the real thing.
CG water
that behaves like real fluid has been around for a while, most notably in tech
demos showing off that it's possible, or in the occasional big-budget movie.
However, it has been too computationally intensive to put into a real-time
application like a video game. As pretty as the CryEngine is, our hardware just
can't dedicate enough resources to simulating fancy flowing fluids. Now,
though, Nvidia's Miles Macklin and Matthias Müller-Fischer have figured out a
way to reduce the load on hardware, producing a result that is remarkably fluid
yet "suitable for real-time applications."
Position
based fluids, the method used here, is similar to position based dynamics, the
technique that dictates the behavior of computer-generated cloth.
Unfortunately, the exact techniques used to reduce the computational load and
create lifelike water movement haven't been revealed yet; Macklin and
Müller-Fischer are saving the details for a forthcoming research paper. What
the pair did note is that they were able to model surface tension, improve
particle distribution, and lower the overall computational requirements enough
to get everything working.
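Macklin and Müller-Fischer aren't sharing specifics yet, but position based dynamics itself is well documented: rather than integrating forces, the solver predicts where each particle should end up, repeatedly nudges those predicted positions until a set of constraints is satisfied, and then recovers velocities from the corrected positions. The Python/NumPy sketch below shows that general loop with a deliberately simple particle-spacing constraint standing in for the per-particle density constraint a fluid solver would use; every name and parameter here is illustrative and is not taken from Nvidia's actual implementation.

```python
import numpy as np

DT = 1.0 / 60.0           # one 60 fps frame per step
GRAVITY = np.array([0.0, -9.81, 0.0])
SOLVER_ITERATIONS = 4     # a few cheap Gauss-Seidel passes instead of one big solve
REST_DISTANCE = 0.1       # stand-in for the fluid's rest particle spacing
FLOOR_Y = 0.0

def pbd_step(positions, velocities):
    """One position based dynamics frame: predict positions, iteratively
    project constraints on those predictions, then derive velocities."""
    # 1. Apply external forces and predict where each particle wants to go.
    velocities = velocities + DT * GRAVITY
    predicted = positions + DT * velocities

    # 2. Nudge the predicted positions until the constraints are roughly satisfied.
    for _ in range(SOLVER_ITERATIONS):
        # Inequality constraint: stay above the floor.
        predicted[:, 1] = np.maximum(predicted[:, 1], FLOOR_Y)

        # Crude spacing constraint: push overlapping particle pairs apart.
        # A real fluid solver would enforce a per-particle density constraint here.
        n = len(predicted)
        for i in range(n):
            for j in range(i + 1, n):
                delta = predicted[i] - predicted[j]
                dist = np.linalg.norm(delta)
                if 1e-9 < dist < REST_DISTANCE:
                    push = 0.5 * (REST_DISTANCE - dist) * (delta / dist)
                    predicted[i] += push
                    predicted[j] -= push

    # 3. Velocities come from the corrected positions, which keeps things stable.
    velocities = (predicted - positions) / DT
    return predicted, velocities

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0.2, 1.0, size=(50, 3))   # a loose blob of 50 particles
    vel = np.zeros_like(pos)
    for _ in range(120):                        # simulate two seconds
        pos, vel = pbd_step(pos, vel)
    print("lowest particle height after 2 s:", round(float(pos[:, 1].min()), 3))
```

Working directly on positions rather than forces is what keeps this class of solver stable at game-sized time steps, and the iteration count is the obvious knob for trading accuracy against speed, which is presumably part of how the method stays "suitable for real-time applications."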
The
"real-time applications" weren't defined, and we're not sure if that means
consumer-grade entertainment media or the expensive, powerful systems used by
professional designers. Either way, the above video is a mightily impressive
tech demo. The water bounces and flows like it would in real life, shimmying
between cracks, rolling off of curved surfaces, and obeying inertia. Helpfully,
the video morphs the water into little spheres at points, giving us something
of an X-ray view of how the water moves.
Hopefully, the
PhysX duo will be able to bring the position based fluids method to next-gen
consumer-grade applications, and we'll soon be able to watch water soak a
character's shirt while realistically bouncing between his arms and off of his
torso in the process. [Source]