Thursday, January 31, 2013

Real Time Graphics Asg03

Now that I actually understand the graphics pipeline a bit better, this assignment was quite a bit easier.  I understood immediately what the light class would look like and how everything was going to fit together.  It was just a matter of figuring out the right method calls and changing the Mesh file into something more readable (you're welcome).  I think string parsing in C++ actually physically hurts me.  ;)  Anyways, here's the assignment for the randoms reading this.  Basically, this week we had to texture our cube and apply lighting.

The controls are the same as before, WASD controls the camera, the Arrow keys control the cube, and IJKL controls the light source.

The troubles I ran into this week were all kind of trivial.  First of all, I don't know if this was brought up in class, but UV coordinates originate from the top-left corner of the texture.  It seems like each individual aspect of the DirectX graphics pipeline uses a different origin.

The second issue was that I assumed the D3DCOLOR member in s_vertex was 12 bytes because I thought of it as three ints, but it's actually a single packed 32-bit DWORD (4 bytes).  That caused my texture to be drawn horribly wrong because I used the wrong offset into the struct.  That took a while to fix; really, Sherly figured it out.
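Here's a rough sketch of what I mean about the layout (the struct below is a stand-in for illustration, not the actual class code, and the values are made up).  The UVs also show the top-left origin from the first issue: (0,0) is the upper-left corner of the texture and v grows downward.

```cpp
#include <d3d9.h>
#include <cstddef>  // offsetof

// Hypothetical stand-in for the class's s_vertex; the real one may differ.
struct s_vertex
{
    float    x, y, z;   // offset 0,  12 bytes
    D3DCOLOR color;     // offset 12,  4 bytes -- a packed DWORD, not three ints
    float    u, v;      // offset 16,  8 bytes
};

// The vertex declaration offsets have to match the struct exactly; an offset
// that's 8 bytes too big shifts every UV read and scrambles the texture.
D3DVERTEXELEMENT9 vertexElements[] =
{
    { 0, offsetof(s_vertex, x),     D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
    { 0, offsetof(s_vertex, color), D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0 },
    { 0, offsetof(s_vertex, u),     D3DDECLTYPE_FLOAT2,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
    D3DDECL_END()
};

// Texturing one face with the whole image, using the top-left UV origin:
s_vertex quad[4] =
{
    // x      y      z     color                        u     v
    { -0.5f,  0.5f, 0.0f, D3DCOLOR_XRGB(255,255,255), 0.0f, 0.0f }, // top-left
    {  0.5f,  0.5f, 0.0f, D3DCOLOR_XRGB(255,255,255), 1.0f, 0.0f }, // top-right
    {  0.5f, -0.5f, 0.0f, D3DCOLOR_XRGB(255,255,255), 1.0f, 1.0f }, // bottom-right
    { -0.5f, -0.5f, 0.0f, D3DCOLOR_XRGB(255,255,255), 0.0f, 1.0f }, // bottom-left
};
```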

Another aggravating issue was that I used the word texture as a variable name in my fragment shader; it turns out texture is a reserved keyword in HLSL.  I should probably get a syntax highlighting plugin.  The error just said something like 'unexpected token texture'.  I changed it on a whim to texture_sample and it worked.

Finally, I couldn't tell at first whether my light was actually moving, because I was moving it at the same speed as the camera and the cube.  I'm guessing changing its position by 1 has such a minuscule effect on the angle of the light hitting the cube that you have to move it really far to notice.  I changed the controls to offset it by 10 at a time and it works much more nicely.
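For reference, something like this is all I mean (a hypothetical sketch; the project's real input handling is structured differently, and GetAsyncKeyState is just one way to poll the IJKL keys):

```cpp
#include <windows.h>
#include <d3dx9.h>

// Hypothetical light-movement handling.  A step of 1 barely changes the angle
// between the light and the cube, so a bigger step makes the motion visible.
const float kLightStep = 10.0f;

void UpdateLightPosition(D3DXVECTOR3& lightPosition)
{
    if (GetAsyncKeyState('I') & 0x8000) lightPosition.z += kLightStep;
    if (GetAsyncKeyState('K') & 0x8000) lightPosition.z -= kLightStep;
    if (GetAsyncKeyState('J') & 0x8000) lightPosition.x -= kLightStep;
    if (GetAsyncKeyState('L') & 0x8000) lightPosition.x += kLightStep;
}
```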

Here's my required screenshot.
And here's my code!

P.S. Jason and I worked together so that's why we wrote about the same issues.


Tuesday, January 29, 2013

Vinyl Tech Stuff


So the only substantial thing that happened last week was technological stuff.  The cohort of students ahead of us in the program made a forced-runner thesis game in Unity, and I asked them some questions about how they did it.  Turns out they used this pretty fantastic library called iTween that does a lot of animation-type stuff and even music manipulation.  I spent a while this week learning my way around the framework and got a cube spinning around a record.  It sounds like we'll be changing the design idea around a bit this week though, so I didn't go too much further.  iTween will work just as well for the new idea, however, and hopefully we can get started tomorrow or Thursday.

I'll discuss the details of the changes in a later post, along with maybe some screenshots of what we've got.

Thursday, January 24, 2013

Real Time Graphics Asg 02

Woo, check out my wonderful trippy as balls cube.  That took a whole lot of work.

You can control the camera with WASD and the cube with the arrow keys; it works just like the instructor's, and it spins constantly.  Most of the code ended up in cRenderer.  It's kinda gross in terms of organization at the moment, but until I really see what's going on and where we're going with this class, I'm gonna leave it as is.

So this assignment was pretty dang hard.  I understood all the concepts of the various view spaces discussed in class, and understood what we were supposed to implement, which I'll go over here to humor John Paul.

Basically we created a cube in model space, converted it to world space in the shader, converted that to view space and finally to projected space and sent that to the fragment shader.  It made plenty of sense and I've had a lot of 3D math experience, but actually writing the code was a whole other ordeal.  I relied very heavily on the video of the lecture since a lot of sample code could be gleaned from there.  I had the most trouble figuring out how various components communicated with each other though.
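On the C++ side, getting those transforms into the vertex shader each frame looks roughly like this (the constant names are made up for illustration, not the class code base's actual names, and this assumes the shader was compiled with D3DXCompileShaderFromFile so a constant table is available):

```cpp
#include <d3dx9.h>

void SetTransforms(IDirect3DDevice9* device,
                   ID3DXConstantTable* vertexShaderConstants,
                   const D3DXMATRIX& modelToWorld,
                   const D3DXMATRIX& worldToView,
                   const D3DXMATRIX& viewToProjected)
{
    // Each matrix gets uploaded to the register the constant table mapped it to.
    vertexShaderConstants->SetMatrix(device,
        vertexShaderConstants->GetConstantByName(NULL, "g_transform_modelToWorld"), &modelToWorld);
    vertexShaderConstants->SetMatrix(device,
        vertexShaderConstants->GetConstantByName(NULL, "g_transform_worldToView"), &worldToView);
    vertexShaderConstants->SetMatrix(device,
        vertexShaderConstants->GetConstantByName(NULL, "g_transform_viewToProjected"), &viewToProjected);
}
```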

For instance, assembling the index buffer was easy: I drew a diagram, counted the vertices, and set it all up using the already-written vertex buffer code.  It's almost identical; you just have to change a lot of method calls to the index equivalent.  Making the two buffers play nice was another matter that actually ended up being as simple as changing the draw method, but it wasn't immediately obvious.
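Roughly, the Direct3D 9 version of that looks like the sketch below (simplified, no error checking, and the names are mine rather than the class code base's):

```cpp
#include <d3d9.h>
#include <cstring>  // memcpy

// A cube only needs 8 unique vertices; 6 faces * 2 triangles * 3 indices = 36 indices.
static const WORD kCubeIndices[36] =
{
    0,1,2,  0,2,3,   // front
    4,6,5,  4,7,6,   // back
    4,5,1,  4,1,0,   // top
    3,2,6,  3,6,7,   // bottom
    1,5,6,  1,6,2,   // right
    4,0,3,  4,3,7,   // left
};

IDirect3DIndexBuffer9* CreateCubeIndexBuffer(IDirect3DDevice9* device)
{
    IDirect3DIndexBuffer9* indexBuffer = NULL;
    device->CreateIndexBuffer(sizeof(kCubeIndices), D3DUSAGE_WRITEONLY,
                              D3DFMT_INDEX16, D3DPOOL_DEFAULT, &indexBuffer, NULL);

    // Same Lock/copy/Unlock dance as the vertex buffer, just on the index buffer.
    void* data = NULL;
    indexBuffer->Lock(0, sizeof(kCubeIndices), &data, 0);
    memcpy(data, kCubeIndices, sizeof(kCubeIndices));
    indexBuffer->Unlock();
    return indexBuffer;
}

void DrawCube(IDirect3DDevice9* device, IDirect3DIndexBuffer9* indexBuffer)
{
    // The "as simple as changing the draw method" part: DrawPrimitive becomes
    // DrawIndexedPrimitive once the index buffer is bound.
    device->SetIndices(indexBuffer);
    device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
                                 0,      // base vertex index
                                 0, 8,   // min vertex index, vertex count
                                 0, 12); // start index, triangle count
}
```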

The hardest part was figuring out how to make the camera communicate with everything.  Setting up a class was easy, but it ended up just sitting there until a classmate showed me how to integrate it with the world-to-view transform, which in retrospect makes total sense.  View-to-projected, on the other hand, just takes weird, unintuitive parameters.
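In D3DX terms, the two transforms boil down to something like this (a sketch with made-up parameter values, not the assignment's actual numbers):

```cpp
#include <d3dx9.h>

void BuildCameraMatrices(const D3DXVECTOR3& cameraPosition,
                         const D3DXVECTOR3& lookAtTarget,
                         float aspectRatio,
                         D3DXMATRIX& worldToView,
                         D3DXMATRIX& viewToProjected)
{
    // World-to-view only needs the camera's position, what it looks at, and an up vector.
    const D3DXVECTOR3 up(0.0f, 1.0f, 0.0f);
    D3DXMatrixLookAtLH(&worldToView, &cameraPosition, &lookAtTarget, &up);

    // View-to-projected is where the weird parameters live: vertical field of
    // view (in radians), aspect ratio, and the near/far clip planes.
    D3DXMatrixPerspectiveFovLH(&viewToProjected,
                               D3DX_PI / 4.0f,  // 45 degree vertical FOV
                               aspectRatio,     // e.g. 800.0f / 600.0f
                               0.1f,            // near plane
                               100.0f);         // far plane
}
```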

Once everything was built, I still had a lot of debugging I had to do.  When you copy and paste code, things don't always get changed as they should.  The most frustrating part was seeing a perfect cube drawn in the PIX Mesh tab like so, but seeing nothing in my fun little window.
Turns out I had set up my camera's translate function to equal an offset rather than plus-equal an offset, so it was perpetually trapped inside the cube, where nothing was rendered.  I had to set a culling mode that a classmate showed me so that I could at least see that things were being drawn before realizing the camera translate bug.
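The bug amounted to something like this (a hypothetical one-liner, not the actual cCamera code):

```cpp
#include <d3dx9.h>

// Assign vs. accumulate: the difference between a camera stuck at the offset
// and one that actually moves each frame.
void TranslateCamera(D3DXVECTOR3& position, const D3DXVECTOR3& offset)
{
    // position = offset;   // bug: the camera never leaves the cube's interior
    position += offset;     // fix: accumulate the movement
}
```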

This was an awesome assignment though; I learned a ton about the graphics pipeline and I can't wait to try out this lighting business.  After all, that's why they're called shaders, I assume.  Here's the source code.

Monday, January 21, 2013

Vinyl is a go!

So the professors revealed last week that they won't be choosing our games, but we will.  There were a few stipulations, but basically the teams must be a minimum size of 5 and max of 12.  After that, pretty much anything goes.

I was incredibly impressed by the sheer number of students in our program pitching and the overall quality of most of them.  As you can gather from this title, my game was one of the few chosen to be made, but during the pitches I kept thinking I'd be happy to work on almost any of the games pitched.  Nothing is official until tomorrow, but it looks like the three games will be prototyped over the next few weeks.

Cellblock is an asynchronous game about a hacker trapped in a prison cell being rescued by a soldier.  The soldier player will play through an FPS-style game, working his way toward the hacker, while the hacker helps him fight through the prison remotely by disabling security cameras and whatnot.

The next game has no name yet, but it focuses on physics-based combat and/or platforming.  I helped work on a prototype for the basis of this game last semester and I'm really excited it got picked.  Cody, the lead on the project, further refined the design and pitched a very compelling idea that could become a fun combat game or perhaps a platformer, or maybe even some combination of the two.

Vinyl was my game, which I described last week.  I can't wait to get started on the prototype; I've always wanted to make a music game and this will be a great opportunity with a very talented group of engineers.  The current plan is to prototype in Unity, and then decide which engine we want to really build it with sometime in February.  Unity would be ideal for deploying to multiple platforms, but I'm not a huge fan of the lack of flexibility the engine offers.  It's great for getting something up and running fast, but it seems pretty terrible for getting anything very specific working.  We will see, I suppose.

Thursday, January 17, 2013

Real Time Graphics Asg01

So I found this first graphics assignment very interesting, seeing as I'm completely new to shaders.  I ran into several interesting problems, so let's just start from the beginning.  For random people who stumble upon this blog, here's the assignment, assuming it stays up a while.  For the TL;DR crowd, we basically had to take our teacher's code base, which draws one triangle through a vertex shader and a fragment shader, and change it to draw a rectangle and modify the shaders.

First, after looking through the source code our teacher provided, I figured out how the various asset builders worked.  Since the MeshBuilder was explicitly stated to just copy a file to a new location, I took the TextureBuilder code and pasted it into a MeshBuilder.  And lo and behold, I got a compiler error.  Those are the best kind of errors, aren't they?  The CopyFile call complained that it couldn't take a character array, and after a lot of tinkering I found the option to change the character set for a Visual Studio project.
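For anyone hitting the same thing: with the project's Character Set option set to Unicode, the CopyFile macro resolves to CopyFileW and wants wide strings; the Multi-Byte setting (or calling the A/W version explicitly) takes plain char arrays.  A rough sketch, with made-up paths:

```cpp
#include <windows.h>

bool CopyMeshFile()
{
    // Unicode character set: CopyFile expands to CopyFileW and needs wide strings.
    // return CopyFileW(L"data/mesh.txt", L"data/built/mesh.txt", FALSE) != 0;

    // Multi-Byte character set (or the explicit A version): plain char arrays work.
    return CopyFileA("data/mesh.txt", "data/built/mesh.txt", FALSE) != 0;
}
```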

The next fun problem was that the MeshBuilder was taking the file with the vertex coordinates and copying it to a path relative to the application built in the Debug folder, so running the code from within Visual Studio, instead of just clicking on the application, didn't work.  To remedy this I put a dummy folder and vertex file in the same folder as the project solution.  It never actually gets read, but it gets rid of some build errors.

The way I formatted my vertex input file was pretty much the laziest way possible.  I had grand plans, including a triangle count and a triangle label before each set of three vertices.  Upon remembering how terrible string parsing is in C++, I quickly settled on vertices delimited by new lines, with each triangle separated by an extra blank line.  This will surely change in the future, probably next week, but it's a good start.
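Reading that format back in is only a few lines (a sketch with made-up names, not the actual loader in my code):

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct Vertex { float x, y, z; };

// One "x y z" vertex per line; a blank line just marks the start of the next triangle.
std::vector<Vertex> LoadVertices(const char* path)
{
    std::vector<Vertex> vertices;
    std::ifstream file(path);
    std::string line;
    while (std::getline(file, line))
    {
        if (line.empty())
            continue;
        std::istringstream stream(line);
        Vertex v;
        if (stream >> v.x >> v.y >> v.z)
            vertices.push_back(v);
    }
    return vertices;
}
```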

The rest of the fun problems mainly consisted of learning DirectX rules, such as winding order, and finding where our teacher hard-coded the draw primitive count to 1.  It initially also appeared that I had to draw my triangles starting from the top-left-most vertex, but I eventually figured out that clockwise winding was the only requirement, and that it can be reversed with a render-state setting.
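That setting is the cull mode (a quick sketch; by default Direct3D 9 culls counter-clockwise triangles, which is why only clockwise ones show up):

```cpp
#include <d3d9.h>

void SetWindingConvention(IDirect3DDevice9* device)
{
    device->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW);     // default: cull CCW, draw CW
    // device->SetRenderState(D3DRS_CULLMODE, D3DCULL_CW);   // draw CCW triangles instead
    // device->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE); // draw everything (handy for debugging)
}
```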

When I tried adjusting the shaders I had trouble figuring out which variations did what.  I started with the fragment shader, which simply passes through the color it gets from the vertex shader.  I halved the color first and then set it equal to the output color, but did not see any change to the rectangle.  In the vertex shader I changed the color and position based on elapsed time, similar to how the teacher did it, but I passed the time into tangent functions so the square would teleport from one corner of the screen to the other at the end of each pass.  Also, I got a fun little color splash that happened twice during the slower part of the tangent curve.  I tried adjusting the alpha values as a function of the tangent of time as well, but found out alpha blending is turned off by default.  You can see the values changing in the PIX debugger, but they had no effect on screen.
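Turning alpha on is a couple of render states on the CPU side (a sketch; these blend factors are the usual choice, not necessarily what the class code will end up using):

```cpp
#include <d3d9.h>

// Without these, the alpha the shader writes shows up in PIX but is ignored on screen.
void EnableAlphaBlending(IDirect3DDevice9* device)
{
    device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
    device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}
```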

Finally, we had to take a few screenshots showing us debugging in PIX, so here they are.  For some reason, when PIX ran my executable it only drew one of the triangles.  It read the second one, but set its vertices to 0, and I never did figure out why at the time.  UPDATE:  I figured out it was because of a buffer overflow problem, which I've since fixed.  I didn't change the screenshots though, so if you can make out those 0,0 vertices, that's why.



And here's my source code.

Sunday, January 13, 2013

Semester 2 Start!

So I'm back after what was the least relaxing break I think I've ever had.  I spent most of it at the new job or working on our Ubisoft prototype for this competition.  I guess it was worth it though; ours was picked, so we get to compete in Montreal come April.

Anyway, this post is going to be more about ideas than whining about my lack of break.  This semester starts off with game pitch ideas.  Everyone has to pitch, or at least be involved in a pitch.  The top several get selected to be rapidly prototyped, and the top two or three of those become our thesis projects for the rest of our program.

There was a lot of talk about synesthesia, or however you spell it, and how synesthesia games are potentially a good fit for a team with a limited number of artists, which is what we have.  That resonated with me; I've always wanted to make a music game kind of like Rez, so I got to thinking about potential ideas for one.

Not much came to mind at first, but as always, after giving up and trying to sleep a couple days later I thought of taking the grinding mechanic out of Sonic Adventure 2, and turning the rails into strings.  It seemed like a fun idea to jump between strings to dodge obstacles and simultaneously create musical notes.  The game would be almost like a forced runner and if played properly would result in a cool song.  My partner (Mike) and I liked the idea, but we knew it needed more so we kept thinking about it.

After a bit more thought, Mike came up with the idea of replacing the strings with a vinyl record and grinding the ridge like the needle of a record player.  This led to a myriad of new ideas and even allowed us to incorporate the old ideas into a sort of "star power" meta game.  It solved the problem of the mechanic being a gimmick, and it lends itself nicely to a lot more musical tricks, such as balancing on the rail affecting the stereo sound.

I'll post some excerpts from the full design doc once it's ready and comment on the class's reception.