
Friday, April 26, 2013

Real Time Graphics Asg12

Check out my shadows, yo.  This assignment was easier than I was expecting.  The thing I had the most trouble with, even after asking about it in class and discussing it with Chris, was how a fragment from the shadow map could possibly correlate to the projected view.  It wasn't until I wrote the shader and saw what values we were actually comparing that I realized how it worked.
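For anyone else stuck on the same question: the value sampled from the shadow map and the value computed for the fragment are both depths measured from the light, so the comparison is just "is something closer to the light than me at this texel?"  A minimal sketch in plain C++, with made-up names:

```cpp
// A minimal sketch of the comparison the shadow shader ends up doing,
// assuming both depths are already in the light's clip space [0, 1].
// Names and the bias value are illustrative, not from the real code.
bool InShadow(float fragmentLightDepth, float shadowMapDepth, float bias)
{
    // The shadow map stores the depth of the closest surface the light
    // can see at this texel. If this fragment is farther from the light
    // than that surface, something sits between it and the light.
    return fragmentLightDepth - bias > shadowMapDepth;
}
```

The small bias is the usual trick to keep a surface from shadowing itself due to depth precision.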

Anyway, here's my shadow map. 
Other problems I ran into mainly consisted of setting the directional light up properly.  Turns out that if I had just followed the writeup from the beginning I wouldn't have had a problem.

Here's the source code.  Sorry, I don't have more to say, it's 1am and this is the last thing I have to do to finish the semester.  It was brutal, but fantastic, learned a ton, thanks so much!



Thursday, April 11, 2013

Real Time Graphics Asg11

This was a cool assignment.  It was nice to have an easy one after the ridiculous month I just finished.

The only real problem I encountered came from copying and pasting handle code around that set things using the wrong constant tables and whatnot.  This actually led to a cool picture-in-picture effect I hadn't planned on.  I ended up setting the GUI texture to the already-rendered back buffer texture.  I think at one point I even had it stretched over the whole screen, so I was rendering everything to a texture and then pointlessly copying it back over.

Other than that the assignment was pretty straightforward.  When you're done rendering everything, save it to a texture, draw the GUI elements onto that texture, then display it.  Now that I think about it, though, it seems like we should be able to just draw things directly onto the render target rather than copying things around.

Anyway, here's what it looks like with a kiwi as a health bar and a black vignette effect.  I also changed the clear color to green so that you can actually see the vignette effect.


And here's the code.

Real Time Graphics Asg10

Check it!
You see that crazy transparency on the blue spheres as they get close to the wizard's face?  Yeah, I totally did that.  You can too: the arrows move all the spheres around, but now not the floor!  As you get closer to the floor it gets pretty transparent; I went a little overboard with the effect so you can really see it happening.

This assignment was kind of a tricky one.  I had a lot of problems with the depth buffer and what to do with it.  First I tried to copy the texture back to the actual DirectX depth buffer, which triggered an invalid call error.  I was on an airplane back from Montreal at the time, so googling the problem was kind of impossible.  Eventually I realized there was no reason to do this at all, since we can just sample from the texture.  Additionally, figuring out when to set the render targets and where to set them back had me staring at a beautiful black window for some time.

Here's what my depth buffer looked like by the way, and right afterwards is the actual depth buffer.
The biggest problem I encountered was something I didn't even realize was a problem with mine until right before I solved it.  Other Jason was having this weird problem where only part of his spheres were showing up, and when he moved the spheres they went crazy.  My spheres weren't rotating and I had never moved the meshes, only the camera, so I just assumed mine worked and spent my time trying to solve his problem.  Turns out we were updating the meshes' positions after we did the depth pass, which obviously you don't want to do.  We were calling update on every mesh first thing in the draw loop, but since we added the depth pass before that loop, we didn't think to move the mesh update above it.  Luckily I figured it out right around the same time I realized it was even a problem for me.  :)
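The fix boils down to frame ordering: move everything first, then render both passes, so the depth texture describes the positions actually being drawn.  A hypothetical sketch (the struct and names are stand-ins, not the real engine's):

```cpp
// A hypothetical sketch of the ordering fix: the mesh update has to run
// before the depth pass, or the depth texture lags a frame behind the scene.
struct Mesh { float x, y, z; };

void UpdateMeshes(Mesh* meshes, int count, float dx)
{
    for (int i = 0; i < count; ++i)
        meshes[i].x += dx;
}

// Correct frame order (illustrative, not the real call names):
//   UpdateMeshes(...);   // move everything first
//   RenderDepthPass();   // depth texture now matches current positions
//   RenderMainPass();    // samples a depth buffer that agrees with the scene
```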

One thing I kept wondering while working on this assignment: when we implemented the wobble technique, we were told that games only ever do that effect once.  I figured this was because writing the entire screen to a temp buffer, then copying it over and manipulating it to make an object look like it's affecting the background, is really costly.  But we sort of do that a second time in this assignment with the depth buffer (although we never copy it back), and we definitely do it again in the next assignment with the GUI stuff.

Anyway, here's the code.




Wednesday, March 20, 2013

Real Time Graphics Asg09

It's so shiny!!

Seriously, look at how shiny that dude's face is; it looks especially creepy on the floor.  It's like you can almost see an entire galaxy in his face, or maybe just a nebula, I dunno.

Anyway, this assignment was fairly straightforward; I just ran into a few bugs.  Mainly, fixing the camera to work how it was described in the email broke a few things.  I believe the problem was resolved by negating a value.  I'm not entirely sure why we needed to change anything, since from what I understood we were just passing in extra data that we didn't originally need.

Other than that the environment map worked just fine.  DirectX has great support for it.  I wonder, do people invent these technologies and then DirectX includes them in the next major release, or how does that stuff work?  I thought environment mapping was a fairly new thing, but we're still rocking DirectX 9, which is over a decade old now.  Then again, the teacher mentioned Shenmue 2 doing it, and that was like 2002...

Moving on to the second part of the assignment: making anything behind the first sphere wobble according to a tangent function.  I dunno why we like tangents so much; probably because of the asymptote that breaks up the squiggles.  In order to do that we had to draw all the non-transparent geometry, save that to a texture, and then draw the transparent stuff using that texture as a background.  Then copy it all back and actually draw it to the screen.  Here's the screenshot showing the texture before the transparent entity in PIX.
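The tangent-driven distortion itself is tiny.  A hypothetical sketch in plain C++ of the kind of UV offset the wobble shader computes (names and constants are illustrative; the real version runs per-fragment in HLSL):

```cpp
#include <cmath>

// A hypothetical sketch of the tangent wobble: a screen-space UV offset
// driven by tan(time), clamped so the asymptote doesn't fling the sample
// way off the texture.
float WobbleOffset(float time, float amplitude, float maxOffset)
{
    float offset = std::tan(time) * amplitude;
    if (offset >  maxOffset) offset =  maxOffset;
    if (offset < -maxOffset) offset = -maxOffset;
    return offset;
}
```

The clamp is why the asymptote reads as a sudden "snap" rather than the image flying apart.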

I was having trouble figuring out where to place the copy-it-back code, because when I stuck it in my transparency if-checks it never fired, since none of the entities were technically marked as transparent.  This led to a nice black screen, even though I could see all the stuff being rendered properly in PIX.  Sherly helped me modify my if-check so that I got everything but the floor; it turned out I was now copying the texture over to the back buffer, but doing it twice, so I overwrote the floor with the second entity.  It was kinda neat to step through each draw call in PIX and watch the exact moment the floor went bye-bye.  Fixed that one with even more modifications to my if-check.

Those were the only real problems; the actual graphics part of the assignment was pretty straightforward.  Which is odd, because they are both pretty bizarre ideas.  Reflecting something that isn't actually there has a surprisingly cool effect that is very noticeably different from plain specular.  I assume you would generally want the environment map to mirror whatever is in your skybox, though.  I was also blown away by how we achieve the squiggly ball effect.  Literally saving the entire frame and then drawing over it seems like a ridiculous amount of extra work to do every single frame.  That said, we are essentially doing that anyway with each mesh being drawn on top of the previous one, but we're only drawing the fragments we need each time.

The teacher said we could only do it once or it'd get too slow, but I wonder if we could pull it off again considering how little we're actually doing in our scenes.

Anyway, source code!


Thursday, March 7, 2013

Real Time Graphics Asg08

This week's assignment was pretty cool, actually.  I always had an inkling of what bump mapping was from being around games all the time, but upon actually implementing a simple normal map I'm a little blown away by what you can do to a surface just by altering some normals.  Here's the assignment for the random googlers.  And here's what the assignment ended up looking like.  Guess which one is specular.


I'm not entirely sure what to include in this writeup; usually I talk about all the horrible things that went wrong, but this one was pretty straightforward.  The hardest part was properly scaling the cube in Maya to output a cube the same size as our hand-coded cubes.  Changing the Maya exporter to output bitangents and tangents was exactly the same as how we were already outputting the normals.  The actual fragment shader code was pretty simple too, just modifying some normals with the new values.
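The heart of "modifying some normals with the new values" is rotating the sampled tangent-space normal through the tangent/bitangent/normal basis.  A plain C++ stand-in for what the shader does (names are made up; the real thing is a few lines of HLSL):

```cpp
// A hypothetical sketch of normal mapping: the normal sampled from the
// texture (already decoded from [0,255] to [-1,1]) is expressed in
// tangent space, so rotate it into the mesh's space using T, B, and N.
struct Vec3 { float x, y, z; };

Vec3 PerturbNormal(Vec3 sampled, Vec3 tangent, Vec3 bitangent, Vec3 normal)
{
    // result = sampled.x * T + sampled.y * B + sampled.z * N
    return {
        sampled.x * tangent.x + sampled.y * bitangent.x + sampled.z * normal.x,
        sampled.x * tangent.y + sampled.y * bitangent.y + sampled.z * normal.y,
        sampled.x * tangent.z + sampled.y * bitangent.z + sampled.z * normal.z,
    };
}
```

A flat normal-map texel decodes to (0, 0, 1), which falls straight through to the original vertex normal, so a blank map changes nothing.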

I did find it interesting how DirectX handles multiple textures per mesh.  I figured we'd have to explicitly call SetTexture with the specific texture name, their way is much nicer.

Other than that I spent a bunch of time making new text files and cleaning things up.  I finally took the specular exponent out of the scene file, even though I wasn't even using it.  Additionally, I decided on an expected texture type of 0 for just a texture and 1 for a texture plus normal map.  This way I can just do a simple int compare in the update loop; it might not be the most human-readable though.  Spoilers: 2 will probably include environment maps.
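Something like this, roughly (the names are hypothetical; the real files just carry the raw ints):

```cpp
// A hypothetical sketch of the texture-type codes: a plain int read from
// the material file, compared directly in the update loop.
enum TextureType
{
    kTextureOnly       = 0,
    kTextureAndNormal  = 1,
    kEnvironmentMapped = 2,  // spoilers: probably coming later
};

bool NeedsNormalMap(int textureType)
{
    return textureType == kTextureAndNormal;
}
```

Naming the codes in one place would recover most of the human-readability without giving up the cheap int compare.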

I also spent a lot of time figuring out how I broke my specular lighting awhile ago.  Apparently I was passing a zero in for light intensity, which when multiplied with my specular lighting went to zero.

Anyway, here's the code.

Thursday, February 28, 2013

Real Time Graphics Asg07

This assignment wasn't too shabby.  Once again the hardest parts seemed to be the sorting and setting up the PIX calls properly.  I probably went about this the completely wrong way, but it's been one of those weeks, or months, or semesters, or whatever.  Anyway, for the assignment we had to set up PIX event calls to make debugging in PIX easier.  Which is pretty awesome, but I haven't really had a reason to use it in a while.  I suppose that's good; I'm sure I'll learn all the details when something goes horribly wrong, as with most debugging tools.  Check out these awesome buckets though.




The transparency stuff wasn't too bad.  Sorting the draw order so you could see all three cubes inside of each other was kind of a hassle, but not terribly difficult.  I now have somewhat gross sorting loops in more than enough places, but hey, selection sort on 9 items is probably manageable.

I think the thing that caused me the most trouble was actually a sorting loop that got copied and pasted from another one incorrectly.  It was causing my transparency to not work at all, I think because certain settings were being turned off at the wrong times.  Once I caught the error, a counting issue, everything just worked.

Also, I had an issue with the world being incredibly dark, but I think that problem stemmed from some bigger issues.  I'm not sure my specular lighting was functioning exactly as it should.  Speaking of which, I finally moved the specular exponent to the materials, so be happy about that.  :)  I just realized I did not take the number out of my scene file, though, so it is still being read in and parsed and not used.  I'll fix that later; everything is already zipped up and uploaded now.  Anyway, back on track: I thought my specular lighting wasn't quite working properly, and upon looking into it I noticed I was actually passing a zero into the shader instead of my scene file's value.  I probably switched it out a while ago as a test and forgot about it.  Upon fixing that, and a few other things I messed up this week, the scene went really dark.  Part of that was because my ambient lighting was really small to compensate for some of the other errors I had.

Long story short, there were several small lighting problems that when fixed turned into other problems.  I think I got it all sorted now.

Here's the code.  Sorry this writeup was so rambly and all over the place.  Hope you got the gist of it; nothing actually pertaining to the assignment was terribly difficult this week, just a lot of little things that needed to be fixed.

Thursday, February 21, 2013

Real Time Graphics Asg06

This assignment was pretty cool; it was the first time I've ever dealt with reading and writing binary files, and it's a lot simpler than I thought.  I still encountered an infuriating bug that was really hard to fix since everything was in binary, but finding it was pretty easy.  Fixing it ended up being easier than what I was trying to do in the first place anyway.  I'm still baffled by the stuff you can do with C++.  This ain't your grandmother's programming language... then again, it might be old enough to be.

Anyway, on to the graphics.  This assignment was more about sorting and optimization than anything.  We had to sort our entities by their shader and texture in an effort to minimize the slow swapping operations in the update loop.  As you can see in my lovely screenshot here, I set the materials together, the effects together, and then all the entity stuff.  I basically just sorted the entity array after I parsed it in.

The actual fun part of the assignment was writing the Maya plugin.  Well, more like writing a method of the Maya plugin.  It was still awesome to be able to model something in Maya, export it into my customized format, and parse that into my own graphics system.  I have a feeling that when it comes time to add FBX support to our game engine's renderer next semester, we'll have a much easier time.  Nothing really bad happened with this section of the assignment, except the Maya install on my laptop apparently broke between now and December, when I last used it.  I couldn't hit the browse button in the plug-in manager, or the export-all button, or most menu buttons without the program immediately becoming perpetually unresponsive.  It wasn't the plugin either; I couldn't even get that far.  I ended up just using my lab computer.

The only big problem I ran into was the one I mentioned above.  Since I'd never written or read a binary file before, I was a little confused about how to read it back in.  I ended up trying a way Sherly showed me, which confusingly only worked some of the time.  It worked flawlessly for my cubes and for some of the simpler shapes I exported from Maya, but one of my shapes only drew a single triangle, and the torus read in a negative vertex count.  It's hard to instantiate arrays with a negative size.  Anyway, her method worked for her, and involved some bitwise ORs to convert chars back to an int, but Cody showed me a much easier method that works 100% of the time.  I'm thinking Sherly's method didn't work from machine to machine due to big-endian and little-endian differences, but I'm pretty sure she tested hers on the same lab machines we were using, so I'm not sure.
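A hypothetical reconstruction of the bitwise-OR approach, to show where the byte-order assumption hides (the function name is mine, not Sherly's):

```cpp
#include <cstdint>

// Reassembling an int32 from four bytes, assuming the file stored the
// least significant byte first. If the writer used the other order, the
// bytes land in the wrong positions: a small count can come back huge,
// and a byte with its top bit set can land in the sign bit, which would
// explain a torus with a negative vertex count.
int32_t BytesToIntLittleEndian(const unsigned char b[4])
{
    uint32_t u =  (uint32_t)b[0]
               | ((uint32_t)b[1] << 8)
               | ((uint32_t)b[2] << 16)
               | ((uint32_t)b[3] << 24);
    return (int32_t)u;
}
```

The simpler method is presumably reading the bytes straight into an int32_t, which works whenever the file was written on a machine with the same byte order, with no hand-assembly to get wrong.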

Anyway, it was a pretty neat assignment; here's the code.  Oh, one thing to note: with the changes I made to MeshBuilder in order to write the files to binary, it no longer overwrites existing files, so if you try to change a mesh in the Assets folder, you'll currently have to delete the mesh files out of the temp/data folder.  Sorry about that; I'll find a fix for it next week.  It's probably as simple as using another method out of that library that has CopyFile instead of using fstream.

Thursday, February 14, 2013

Real Time Graphics Asg05

This assignment was pretty cool.  It's nice seeing all the effort we initially put into this pipeline allow us to expand the scene so rapidly.  Just look at all these cubes!
Complete with specular and diffuse lighting.  The four on the left are specular, as you can see by that awesome glare on the left Rubik's cube, while the right four are only diffuse.  Also, the front four use the 24-vertex mesh with proper normals, while the back four use the mesh that has only 8 vertices and averaged normals.

The controls are the same, WASD to move the camera, IJKL for the light and the arrow keys for the meshes.

I'm not entirely sure what to talk about this time around.  Specular lighting is pretty awesome, but the math behind it is pretty simple.  It made for an easy assignment that consisted of maybe 30 minutes of graphics-related work and maybe 4 hours of creating and manipulating text files and then debugging said text files.  I ended up using the message logging system a ton to figure out where all my errors were, so I'm glad I kept up with our teacher's policy of logging errors everywhere.

One issue that kind of confused me is that the specular lighting doesn't appear to be working on the back-left two cubes that use the averaged normals.  Supposedly it should work better, and it's hard to tell because they are so far back, but I don't think it's working.  You can rearrange them pretty easily with the text files if you are interested.  I don't get how the lighting can work for the other meshes but not for all of them.
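For context, here's where those averaged normals come from.  Each of the 8 cube vertices touches three faces, so its normal is the normalized sum of those face normals (a plain C++ sketch, not the actual mesh code):

```cpp
#include <cmath>

// A hypothetical sketch of vertex-normal averaging: sum the normals of
// the faces sharing the vertex, then renormalize.
struct Vec3 { float x, y, z; };

Vec3 AverageNormals(const Vec3* faceNormals, int count)
{
    Vec3 sum = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < count; ++i)
    {
        sum.x += faceNormals[i].x;
        sum.y += faceNormals[i].y;
        sum.z += faceNormals[i].z;
    }
    float len = std::sqrt(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
    return { sum.x / len, sum.y / len, sum.z / len };
}
```

One thing worth noticing: a corner normal like this points diagonally out of the corner rather than straight out of any face, so a highlight that lines up nicely on the 24-vertex cube might simply land somewhere unexpected here, which could be what's making it look broken.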

Additionally, I set up my code so that the specular exponent could be read in from the scene file before noticing I was supposed to put it in each individual material file.  That makes sense, since it lets us specify how rough or smooth and shiny various materials are, but I don't have the sheer force of will to go back, change all the code, and add the getters and setters to make that work right now.  I'll probably switch it over next week, or when we do some bump mapping (we are doing bump mapping eventually, right?).

Oh, one last question I forgot to bring up in class, but found interesting.  I chose to pass the world-to-view transform into my specular fragment shader because it was easy, but it seemed like it'd be faster and more efficient to output it from the vertex shader and intercept it in the fragment shader.  Then again, you'd have a vertex shader outputting information that not every fragment shader needs.  Is there a better choice for passing this data?

Here's the source code!

Sunday, February 10, 2013

Real Time Graphics Asg04

So this assignment ended up being way harder than I thought.  Here it is.  This time we had to update our content pipeline so that we could generate all the relevant graphics data from text files.  Now, with minor changes to some text, I can add dozens upon dozens of cubes of various colors and textures, with colored light and intensity.  It's pretty much the greatest graphics content pipeline ever, objectively speaking.

Seriously though, the assignment had very little to do with actual graphics stuff, but it ended up being kind of interesting regardless.  It was a lot of string parsing, which is the devil, and took forever, but that stuff is pretty boring and not too challenging, so I don't really feel the need to talk about it.  It did take me like ten hours for that chunk alone, though.

The reason it was interesting was the remaining 5 hours.  You know, the 5 hours you expect to take about 1, but it never really works out that way.  Once I got all the parsing stuff in, figuring out how to modify the renderer class our teacher built to display multiple sets of vertex data was quite the ordeal.  I ended up wrapping most of the LoadScene, ReleaseScene, and Update methods in loops that basically did the same thing for each cube I had to draw.  The whole process didn't seem quite right, but by this point in the project I was somewhat out of steam.  I'll probably look into rearranging and optimizing things in later assignments.  In fact, I think that's going to be a requirement of an upcoming assignment.  :)

I also spent a lot of time trying to figure out the best way to pass data up from the parsers and into the eventual DirectX calls.  The other Jason and I had a pretty big debate about some aspects of it, mainly because he wanted to copy all the vertex and index data into the buffers we were already using and not change the DirectX calls, while I figured we'd save space and time by passing the buffers we populated during parsing directly.  After about 3 hours of trying to make it work my way, I gave up and did it his way.  I think I got really close, because the biggest culprit of the errors I was getting turned out to be a typo in my mesh parsing code.  Still, at some point you've just got to cut your losses.  I had other stuff to work on, and I did meet all the requirements.

Anyway, despite this assignment not having much to do with the high-level graphics stuff we always discuss in class, it forced me to get way more familiar with what all those DirectX calls were actually doing, since a lot of them are now done three times.  Not all of them though; turns out clearing the screen three times per Update call is a bad idea.

Here's my PIX depth screenshot by the way.  I think this part of the assignment took about 7 minutes, but it's definitely noticeable.
And here's my source code.

Thursday, January 31, 2013

Real Time Graphics Asg03

Now that I actually understand the graphics pipeline a bit better, this assignment was quite a bit easier.  I understood immediately what the light class would look like and how everything was going to fit together.  It was just a matter of figuring out the right method calls and changing the Mesh file into something more readable.  You're welcome; I think string parsing in C++ actually physically hurts me.  ;)  Anyway, here's the assignment for the randoms reading this.  Basically, this week we had to texture our cube and apply lighting.

The controls are the same as before, WASD controls the camera, the Arrow keys control the cube, and IJKL controls the light source.

The troubles I ran into this week were all kind of trivial.  First of all, I don't know if this was brought up in class, but UV coordinates originate from the top left.  It seems like each individual part of the DirectX graphics pipeline uses a different origin.
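If your UVs were authored assuming a bottom-left origin, the fix is a one-liner (a trivial sketch, just for the record):

```cpp
// Flip a V coordinate between a bottom-left origin and D3D's
// top-left origin.
float FlipV(float v)
{
    return 1.0f - v;
}
```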

The second issue was that I assumed the D3DCOLOR member in s_vertex was 12 bytes, like three ints, but it's actually a single packed 32-bit value.  That caused my texture to be drawn horribly wrong, because I used the wrong offset into the struct.  It took a while to fix; really, Sherly figured it out.
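A quick way to check assumptions like this is offsetof.  A hypothetical layout for the vertex struct (not the real s_vertex, but the same idea):

```cpp
#include <cstddef>
#include <cstdint>

// A hypothetical vertex layout: D3DCOLOR is one packed 32-bit ARGB
// value, so the color field is 4 bytes, not 12, and everything after
// it sits 8 bytes earlier than a three-int assumption would put it.
struct VertexSketch
{
    float    x, y, z;  // position: 12 bytes
    uint32_t color;    // D3DCOLOR:  4 bytes
    float    u, v;     // UVs start at offset 16, not 24
};
```

The vertex declaration you hand to D3D has to match these offsets exactly, which is why being 8 bytes off scrambles the texture so badly.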

Another aggravating issue: I used the word texture in my fragment shader, which turns out to be a reserved word in HLSL.  I should probably get a syntax highlighting plugin.  The error just said something like 'unexpected token texture'.  I changed it on a whim to texture_sample and it worked.

Finally, I couldn't tell if my light was actually moving at first, because I was moving it at the same speed as the camera and the cube.  I'm guessing changing it by 1 has such a minuscule effect on the angle hitting the cube that you have to move it really far to notice.  I changed the controls to offset by 10 at a time and it works much more nicely.

Here's my required screenshot.
And here's my code!

P.S. Jason and I worked together so that's why we wrote about the same issues.


Thursday, January 24, 2013

Real Time Graphics Asg 02

Woo, check out my wonderful trippy as balls cube.  That took a whole lot of work.

You can control the camera with WASD and the cube with the arrow keys; it works just like the instructor's, and it spins constantly.  Most of the code ended up in cRenderer.  It's kinda gross in terms of organization at the moment, but until I really see what's going on and where we're going with this class, I'm gonna leave it as is.

So this assignment was pretty dang hard.  I understood all the concepts of the various view spaces discussed in class, and understood what we were supposed to implement, which I'll go over here to humor John Paul.

Basically, we created a cube in model space, converted it to world space in the shader, converted that to view space and finally to projected space, and sent that to the fragment shader.  It made plenty of sense, and I've had a lot of 3D math experience, but actually writing the code was a whole other ordeal.  I relied very heavily on the video of the lecture, since a lot of sample code could be gleaned from there.  I had the most trouble figuring out how the various components communicated with each other, though.
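Stripped of the D3DX types, the whole chain is just three matrix multiplies.  A hypothetical sketch in plain C++, assuming row-major matrices with row-vector multiplication (the real code uses the D3DX math library instead):

```cpp
// A sketch of the transform chain: the same point pushed through
// model-to-world, world-to-view, and view-to-projected in order.
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[4][4]; };

Vec4 Transform(const Vec4& v, const Mat4& a)
{
    return {
        v.x * a.m[0][0] + v.y * a.m[1][0] + v.z * a.m[2][0] + v.w * a.m[3][0],
        v.x * a.m[0][1] + v.y * a.m[1][1] + v.z * a.m[2][1] + v.w * a.m[3][1],
        v.x * a.m[0][2] + v.y * a.m[1][2] + v.z * a.m[2][2] + v.w * a.m[3][2],
        v.x * a.m[0][3] + v.y * a.m[1][3] + v.z * a.m[2][3] + v.w * a.m[3][3],
    };
}

Vec4 ModelToProjected(const Vec4& modelPos, const Mat4& modelToWorld,
                      const Mat4& worldToView, const Mat4& viewToProjected)
{
    Vec4 world = Transform(modelPos, modelToWorld);
    Vec4 view  = Transform(world, worldToView);
    return Transform(view, viewToProjected);
}
```

The camera's job is just to supply worldToView (its own placement, inverted), which is the integration step that tripped me up.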

For instance, assembling the index buffer was easy: I drew a diagram, counted the vertices, and set it all up using the already-written vertex buffer code.  It's almost identical; you just have to change a lot of method calls to their index equivalents.  Making the two buffers play nice was another matter that ended up being as simple as changing the draw method, but it wasn't immediately obvious.
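The counting works out to 6 faces times 2 triangles times 3 indices, so 36 entries referencing only 8 vertices.  A hypothetical layout, assuming vertices 0-3 are the front face and 4-7 the matching back face (a real mesh's diagram dictates its own numbering and winding):

```cpp
// A hypothetical index buffer for an 8-vertex cube: 12 triangles,
// 36 indices, each vertex shared by three faces.
static const unsigned short kCubeIndices[36] = {
    0, 1, 2,   0, 2, 3,   // front
    5, 4, 7,   5, 7, 6,   // back
    4, 0, 3,   4, 3, 7,   // left
    1, 5, 6,   1, 6, 2,   // right
    4, 5, 1,   4, 1, 0,   // top
    3, 2, 6,   3, 6, 7    // bottom
};
```

The draw call then takes 12 as the primitive count rather than the vertex count, which is one of those index-equivalent changes that's easy to miss.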

The hardest part was figuring out how to make the camera communicate with everything.  Setting up a class was easy, but it ended up just sitting there until a classmate showed me how to integrate it into the world-to-view transform, which in retrospect makes total sense.  View-to-projected, on the other hand, just takes weird, unintuitive parameters.

Once everything was built, I still had a lot of debugging I had to do.  When you copy and paste code, things don't always get changed as they should.  The most frustrating part was seeing a perfect cube drawn in the PIX Mesh tab like so, but seeing nothing in my fun little window.
Turns out I set up my camera's translate function to assign an offset rather than add an offset, so the camera was perpetually trapped inside the cube, where nothing was rendered.  I had to set a culling mode that a classmate showed me so I could at least see that things were being drawn before realizing the camera translate bug.

This was an awesome assignment though; I learned a ton about the graphics pipeline, and I can't wait to try out this lighting business.  After all, that's why they're called shaders, I assume.  Here's the source code.

Thursday, January 17, 2013

Real Time Graphics Asg01

So I found this first graphics assignment very interesting, seeing as I'm completely new to shaders.  I ran into several interesting problems, so let's just start from the beginning.  For random people who stumble upon this blog, here's the assignment, assuming it stays up a while.  For the TL;DR crowd: we basically had to take our teacher's code base, which draws one triangle run through a vertex and fragment shader, and change it to a rectangle and modify the shaders.

First, after looking through the source code our teacher provided, I figured out how the various asset builders worked.  Since the MeshBuilder was explicitly stated to just copy a file to a new location, I took the TextureBuilder code and pasted it into a MeshBuilder.  And lo and behold, I got a compiler error.  Those are the best kind of errors, aren't they?  The CopyFile call was complaining that it couldn't take a character array, and after a lot of tinkering I found an option to change the character type for a Visual Studio project.

The next fun problem was that the MeshBuilder was taking the file with the vertex coordinates and copying it to a path relative to the application built in the Debug folder, so running the code from within Visual Studio, instead of just clicking on the application, didn't work.  To remedy this I put a dummy folder and vertex file in the same folder as the project solution.  It never actually gets read, but it gets rid of some build errors.

The way I formatted my vertex input file was pretty much the laziest way possible.  I had grand plans, including a triangle count and a triangle label before each set of three vertices.  Upon remembering how terrible string parsing is in C++, I quickly settled on vertices delimited by new lines, with every triangle separated by an extra line.  This will surely change in the future, probably next week, but it's a good start.

The rest of the fun problems mainly consisted of learning DirectX's rules, such as winding order, and finding where our teacher hard-coded the draw primitive count to 1.  It initially also appeared that I had to draw my triangles starting from the top-left-most vertex, but I eventually figured out that clockwise winding was the only requirement, and even that can be reversed with a setting somewhere.

When I tried adjusting the shaders, I had trouble figuring out which variations did what.  I tried adjusting the fragment shader, which simply passes along the color it receives from the vertex shader.  I halved the color first and then set it equal to the output color, but did not see any change to the rectangle.  In the vertex shader I changed the color and position based on elapsed time, similar to how the teacher did it, but I passed the time into tangent functions so the square would teleport from one corner of the screen to the other at the end of each pass.  Also, I got a fun little color splash that happened twice during the slower part of the tangent curve.  I tried adjusting the alpha values as a function of the tangent of time as well, but found out alpha blending is turned off by default.  You can see the values changing in the PIX debugger, but it had no effect.

Finally, we had to take a few screenshots showing us debugging in PIX, so here they are.  For some reason, when PIX ran my executable it only drew one of the triangles.  It read the second one, but set its vertices to 0 for some reason.  I never did figure out why.  UPDATE:  I figured out it was because of a buffer overflow problem, which I've since fixed.  I didn't change the screenshots though, so if you can make out those 0,0 vertices, that's why.



And here's my source code.