GPipe
What is GPipe?
GPipe is a library for programming the GPU (graphics processing unit). It is an alternative to using OpenGL, and has the advantage that it is purely functional, statically typed and operates on immutable data, as opposed to OpenGL's inherently imperative style. Another important difference from OpenGL is that with GPipe you don't need to write shaders in a second shader language such as GLSL or Cg; instead you use regular Haskell functions on the GPU data types. GPipe uses the same conceptual model as OpenGL, and it is recommended that you have at least a basic understanding of how OpenGL works to be able to use GPipe.
In GPipe, you'll primarily work with these four types of data on the GPU: PrimitiveStreams, FragmentStreams, FrameBuffers and Textures.
Let's walk through a simple example as I explain how you work with these types. This example requires GPipe version 1.2 or later. This page is formatted as a literate Haskell page: simply save it as "box.lhs" and then type
ghc --make -O box.lhs -package Vec-0.9.6
box
at the prompt to see a spinning box. Future versions of GPipe will use the more recent version of Vec, but for now you'll have to tell it to use the older 0.9.6 version (if only cabal were a little smarter...). You'll also need an image named "myPicture.jpg" in the same directory (I used a picture of some wooden planks).
> module Main where
> import Graphics.GPipe
> import Graphics.GPipe.Texture.Load
> import qualified Data.Vec as Vec
> import Data.Vec.Nat
> import Data.Vec.LinAlg.Transform3D
> import Data.Monoid
> import Data.IORef
> import Data.Fixed (mod')
> import Graphics.UI.GLUT
>     (Window,
>      mainLoop,
>      postRedisplay,
>      idleCallback,
>      getArgsAndInitialize,
>      ($=))
Besides GPipe, this example also uses the Vec-Transform package for the transformation matrices, and the GPipe-TextureLoad package for loading textures from disk. GLUT is used in GPipe for window management and the main loop. Data.Fixed provides the mod' function used to wrap the rotation angle.
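All of these packages are available on Hackage; if you don't have them installed, something along these lines should fetch them (a hedged sketch, since exact package names and versions on Hackage may differ):

cabal install Vec-0.9.6 Vec-Transform GPipe GPipe-TextureLoad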
Creating a window
We start by defining the main function.
> main :: IO ()
> main = do
> getArgsAndInitialize
> tex <- loadTexture RGB8 "myPicture.jpg"
> angleRef <- newIORef 0.0
> newWindow "Spinning box" (100:.100:.()) (800:.600:.()) (renderFrame tex angleRef) initWindow
> mainLoop
> renderFrame :: Texture2D RGBFormat -> IORef Float -> Vec2 Int -> IO (FrameBuffer RGBFormat () ())
> renderFrame tex angleRef size = do
> angle <- readIORef angleRef
> writeIORef angleRef ((angle + 0.005) `mod'` (2*pi))
> return $ cubeFrameBuffer tex angle size
> initWindow :: Window -> IO ()
> initWindow win = idleCallback $= Just (postRedisplay (Just win))
First we set up GLUT, and load a texture from disk via the GPipe-TextureLoad package function loadTexture. In this example we're going to animate a spinning box, and for that we put an angle in an IORef so that we can update it between frames. We then create a window with newWindow, where initWindow registers the window as being continuously redisplayed in the idle loop. At each frame, the IO action renderFrame tex angleRef size is run. In this function the angle is incremented by 0.005 (and reset each lap), and a FrameBuffer is created and returned to be displayed in the window. But before I explain FrameBuffers, let's jump to the start of the graphics pipeline instead.
PrimitiveStreams
The graphics pipeline starts with creating primitives such as triangles on the GPU. Let's create a box with six sides, each made up of two triangles.
> cube :: PrimitiveStream Triangle (Vec3 (Vertex Float), Vec3 (Vertex Float), Vec2 (Vertex Float))
> cube = mconcat [sidePosX, sideNegX, sidePosY, sideNegY, sidePosZ, sideNegZ]
> sidePosX = toGPUStream TriangleStrip $ zip3 [1:.0:.0:.(), 1:.1:.0:.(), 1:.0:.1:.(), 1:.1:.1:.()] (repeat (1:.0:.0:.())) uvCoords
> sideNegX = toGPUStream TriangleStrip $ zip3 [0:.0:.1:.(), 0:.1:.1:.(), 0:.0:.0:.(), 0:.1:.0:.()] (repeat ((-1):.0:.0:.())) uvCoords
> sidePosY = toGPUStream TriangleStrip $ zip3 [0:.1:.1:.(), 1:.1:.1:.(), 0:.1:.0:.(), 1:.1:.0:.()] (repeat (0:.1:.0:.())) uvCoords
> sideNegY = toGPUStream TriangleStrip $ zip3 [0:.0:.0:.(), 1:.0:.0:.(), 0:.0:.1:.(), 1:.0:.1:.()] (repeat (0:.(-1):.0:.())) uvCoords
> sidePosZ = toGPUStream TriangleStrip $ zip3 [1:.0:.1:.(), 1:.1:.1:.(), 0:.0:.1:.(), 0:.1:.1:.()] (repeat (0:.0:.1:.())) uvCoords
> sideNegZ = toGPUStream TriangleStrip $ zip3 [0:.0:.0:.(), 0:.1:.0:.(), 1:.0:.0:.(), 1:.1:.0:.()] (repeat (0:.0:.(-1):.())) uvCoords
> uvCoords = [0:.0:.(), 0:.1:.(), 1:.0:.(), 1:.1:.()]
Every side of the box is created from a regular list of four elements, where each element is a tuple with three vectors: a position, a normal and a uv-coordinate. These lists of vertices are then turned into PrimitiveStreams on the GPU by toGPUStream. All six sides are then concatenated together into a cube. We can see that the type of the cube is a PrimitiveStream of Triangles, whose vertices contain Vertex Floats instead of Floats since they are now on the GPU.
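As a minimal illustration of toGPUStream (not used by the box program), here is a single free-standing triangle whose vertices carry only a position. TriangleList, which is assumed here to be a constructor of GPipe's Triangle type alongside TriangleStrip, treats every three vertices as one triangle:

> -- Illustration only: a single triangle with position-only vertices.
> singleTriangle :: PrimitiveStream Triangle (Vec3 (Vertex Float))
> singleTriangle = toGPUStream TriangleList [0:.0:.0:.(), 1:.0:.0:.(), 0:.1:.0:.()]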
The cube is defined in model space, i.e. positions and normals are relative to the cube. We now want to rotate that cube using a variable angle, and project the whole thing with a perspective projection, as seen through a camera 2 units down the z-axis.
> transformedCube :: Float -> Vec2 Int -> PrimitiveStream Triangle (Vec4 (Vertex Float), (Vec3 (Vertex Float), Vec2 (Vertex Float)))
> transformedCube angle size = fmap (transform angle size) cube
> transform angle (width:.height:.()) (pos, norm, uv) = (transformedPos, (transformedNorm, uv))
> where
> modelMat = rotationVec (normalize (1:.0.5:.0.3:.())) angle `multmm` translation (-0.5)
> viewMat = translation (-(0:.0:.2:.()))
> projMat = perspective 1 100 (pi/3) (fromIntegral width / fromIntegral height)
> viewProjMat = projMat `multmm` viewMat
> transformedPos = toGPU (viewProjMat `multmm` modelMat) `multmv` homPoint pos
> transformedNorm = toGPU (Vec.map (Vec.take n3) $ Vec.take n3 $ modelMat) `multmv` norm
When applying a function to the PrimitiveStream using fmap, that function will be executed on the GPU using vertex shaders. The toGPU function turns ordinary values like Floats into GPU values like Vertex Floats, so they can be used with the vertices of the PrimitiveStream.
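As a tiny sketch of fmap and toGPU together (again not part of the box program), here is a function that shifts every vertex of a position-only stream one unit along the x-axis, relying on the elementwise Num instances of Vec vectors and Vertex Float:

> -- Illustration only: shift a position-only stream along the x-axis.
> -- The addition is elementwise via the vector Num instance.
> shiftRight :: PrimitiveStream Triangle (Vec3 (Vertex Float)) -> PrimitiveStream Triangle (Vec3 (Vertex Float))
> shiftRight = fmap (+ toGPU (1:.0:.0:.()))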
FragmentStreams
To render the primitives on the screen, we must first turn them into pixel fragments. This is called rasterization, and is in our example done by the function rasterizeFront, which turns PrimitiveStreams into FragmentStreams.
> rasterizedCube :: Float -> Vec2 Int -> FragmentStream (Vec3 (Fragment Float), Vec2 (Fragment Float))
> rasterizedCube angle size = rasterizeFront $ transformedCube angle size
In the rasterization process, values of type Vertex Float are turned into values of type Fragment Float.
For each fragment, we now want to give it a color from the texture we initially loaded, as well as light it with a directional light coming from the camera.
> litCube :: Texture2D RGBFormat -> Float -> Vec2 Int -> FragmentStream (Color RGBFormat (Fragment Float))
> litCube tex angle size = fmap (enlight tex) $ rasterizedCube angle size
> enlight tex (norm, uv) = RGB (c * Vec.vec (norm `dot` toGPU (0:.0:.1:.())))
> where RGB c = sample (Sampler Linear Wrap) tex uv
Using fmap on a FragmentStream will execute a function on the GPU using fragment shaders. The function sample samples the texture with the fragments' interpolated uv-coordinates, here using a sampler state with linear filtering and wrapping texture coordinates. Once we have a FragmentStream of Colors, we can paint those fragments onto a FrameBuffer.
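As a small variation (not used above), we could instead sample with point filtering and clamped coordinates; Point and Clamp are assumed here to be the other Filter and EdgeMode constructors of GPipe's sampler state:

> -- Illustration only: like enlight, but with point filtering and
> -- clamped uv-coordinates (Point and Clamp are assumed constructors).
> enlightBlocky tex (norm, uv) = RGB (c * Vec.vec (norm `dot` toGPU (0:.0:.1:.())))
>     where RGB c = sample (Sampler Point Clamp) tex uv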
FrameBuffers
A FrameBuffer is a 2D image onto which FragmentStreams are painted. A FrameBuffer may contain any combination of a color buffer, a depth buffer and a stencil buffer. Besides being shown in windows, FrameBuffers may also be saved to memory or converted to textures, thus enabling multi-pass rendering (a depth-buffered variant is sketched at the end of this section). A FrameBuffer has no defined size, but takes the size of the window when shown, or is given a size when saved to memory or converted to a texture.
And so finally, we paint the fragments we have created onto a black FrameBuffer. For this we use paintColor, with blending turned off and a color mask that lets all color channels through.
> cubeFrameBuffer :: Texture2D RGBFormat -> Float -> Vec2 Int -> FrameBuffer RGBFormat () ()
> cubeFrameBuffer tex angle size = paintSolid (litCube tex angle size) emptyFrameBuffer
> paintSolid = paintColor NoBlending (RGB $ Vec.vec True)
> emptyFrameBuffer = newFrameBufferColor (RGB 0)
This FrameBuffer is the one we return from the renderFrame action we defined at the top.
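As mentioned above, a FrameBuffer may also contain a depth buffer. Here is a hedged sketch of a depth-tested variant of cubeFrameBuffer, assuming GPipe provides newFrameBufferColorDepth, paintColorRastDepth and a Lequal depth test (check the GPipe documentation for the exact names; the snippet is shown without the literate > marks so it won't be compiled with the rest of this page):

-- Assumed API: paint the lit cube with a Lequal depth test against a
-- depth buffer cleared to the far value 100, so that triangle paint
-- order no longer matters.
cubeFrameBufferDepth :: Texture2D RGBFormat -> Float -> Vec2 Int -> FrameBuffer RGBFormat DepthFormat ()
cubeFrameBufferDepth tex angle size =
    paintColorRastDepth Lequal True NoBlending (RGB $ Vec.vec True)
                        (litCube tex angle size)
                        (newFrameBufferColorDepth (RGB 0) 100)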
Screenshot
Questions and feedback
If you have any questions or suggestions, feel free to mail me. I'm also interested in seeing some use cases from the community, however complex or trivial they may be.