Saturday, December 14, 2013

Vinyl at the EAE Open House


The EAE program had an open house this week to show off all the student projects we've been working on for the last year.  We typically get lots of friends and family, local industry professionals and faculty to come check everything out.  Since we're the second-year grad students, we got to set up our projects in a much bigger area.  As it turns out, bright flashy colors and music on an 80 inch TV draw attention, so we were able to get tons of feedback.  We also set up a bunch of computers with feedback forms on the alternate monitors and got almost unanimously positive responses from those.

We still get the occasional complaint that we mess up the music when we're trying to enhance it, or that the game favors the right side too much.  Both are valid, but I'm unsure whether we'll be able to do anything about them if we want to keep any semblance of our original premise alive.  Part of me wants to try making the track either straight or curving in random directions, but we shall see next semester.

The game is basically in a state we're very happy with, and polish and bug fixing will be our main focus from here on out.

Aside from Vinyl, there were tons of other fantastic student projects at the show.  The undergrad PSP games are using the touch screen in unique ways, and the research teams have made several games on topics ranging from fitness to teaching social skills to autistic teens.  The first year grad students all had their rapid prototypes on display, and will be publishing them to the Windows app store before the semester is out.

All in all, I'd say it was a pretty successful evening.  And the cake was delicious.



Thursday, November 28, 2013

Vinyl: Score Design

After much deliberation, we finally decided that Vinyl needed some kind of scoring system.  We resisted this idea for a long time, because we didn't want to enforce a "right way" to play our game and scoring systems naturally imply a proper method of play.  However, without one we kept running into the problem that playtesters didn't understand what to do in our game.

The biggest issue we had with a scoring system is that the game has always been about audio manipulation, and if people want to make their songs sound weird and silly, we didn't want to stop them.  We had all but abandoned that notion, until JJ and our professor came up with the idea of "genre changers," which give the player more agency over the type of manipulation that is happening.  This system has no impact on the score, so the player is still free to play with the sound while still trying for a high score.

We tried a few scoring solutions, all revolving around the idea of a streak.  We initially tried basing everything on that one number, but it got confusing when your streak would suddenly jump 15% for hitting a filter, since the streak no longer meant how long you'd gone without hitting an obstacle.

We have since decoupled the score and the streak: the streak adds to your score, and other things like filters also add to your score, but the streak is always the number of obstacles you've passed since the last one you hit.  It works well, but we've struggled with the growth curves, since certain actions are percentage based and become more valuable once you already have a big score, while others are just linear.  Since all of this math is hidden from the player, it's probably not a good idea to make boosting worth tons of points near the end of the song but essentially worthless at the beginning.

I ended up coding several options for growth curves and score bases into the system so that we can quickly playtest different combinations, all tweakable from within the Unity editor.  I think the system is well designed; we just need to find the balance that makes it feel right.
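To give an idea of what that looks like, here's a minimal sketch of a tweakable scoring component.  The class, field, and enum names are hypothetical rather than our actual code, but the Inspector-exposed fields are the point: curves and base values can be swapped during a playtest without recompiling.

```csharp
using UnityEngine;

// Minimal sketch of a configurable scoring system (hypothetical names, not the real code).
// Fields are serialized so different growth curves and base values can be tweaked in the Inspector.
public class ScoreManager : MonoBehaviour
{
    public enum GrowthCurve { Linear, Percentage, Exponential }

    [SerializeField] private GrowthCurve streakCurve = GrowthCurve.Linear;
    [SerializeField] private float basePointsPerObstacle = 10f;   // flat award for passing an obstacle
    [SerializeField] private float percentBonus = 0.15f;          // e.g. filters add 15% of current score
    [SerializeField] private float exponent = 1.1f;               // used only by the Exponential curve

    public float Score { get; private set; }
    public int Streak { get; private set; }   // obstacles passed since the last hit

    public void OnObstaclePassed()
    {
        Streak++;
        Score += StreakValue();
    }

    public void OnObstacleHit()
    {
        Streak = 0;   // the streak resets, the score is kept
    }

    public void OnFilterHit()
    {
        // Percentage-based rewards scale with the current score,
        // which is why they dwarf linear rewards late in a song.
        Score += Score * percentBonus;
    }

    private float StreakValue()
    {
        switch (streakCurve)
        {
            case GrowthCurve.Percentage:  return basePointsPerObstacle + Score * percentBonus;
            case GrowthCurve.Exponential: return basePointsPerObstacle * Mathf.Pow(Streak, exponent);
            default:                      return basePointsPerObstacle * Streak;
        }
    }
}
```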

Friday, November 15, 2013

Vinyl: IGF Edition!

We have officially submitted our game to the IGF Student Competition!  Actually, we officially did on Halloween... and then a couple more times after that, but I'm bad about updating my blog.  The key things of note since submitting were the post mortem our professors required of us and general team grooviness.

I've said for a while now that the engineering team generally works fine: we get all our tasks done and are typically all on the same page.  The same can't really be said for the team as a whole, and we've been making some pretty serious strides to remedy that.  The post mortem focused mainly on us having entirely too many meetings where nothing useful gets accomplished, along with general communication and team morale problems.

I think my favorite remedy to come out of the post mortem is our new policy of putting everyone's contributions into the game regardless of quality, and not replacing them until we have something better.  Before, we would either not put something in if the whole team didn't like it, or we would take it out fairly quickly.  That discouraged the artists and left the game looking the same (programmery) for so long that actually refining an art style became incredibly difficult.

Since then, the art team has collaborated with the artists from the program's other thesis project to come up with an art style we can seriously push forward with.  This probably should have been done six months ago, since we ideally content lock in less than a month, but it's refreshing to see the rest of the team have the enthusiasm we all had when we started.  I think with a bit of minor crunching, and maybe slipping a few weeks past our deadline, we'll have a much better looking game.

The engineers are almost entirely working on simple implementation changes, UI stuff and minor design tweaks, so while we have our hands pretty full, it's not with the kind of things that are difficult.

Aside from that, we've also become a bit more agreeable about design changes.  Largely, I think that's because the game is in a state that we like, so changes are very minor.  Additionally, we are trying harder to take an "implementer gets creative authority over how a feature works" approach, which I thought we were kind of doing before, but I guess some people thought otherwise, once again due to communication issues.

All in all, I'm excited to see what Vinyl looks like in the next month.  Check out the current IGF build at www.thevinylgame.com

Monday, October 14, 2013

Vinyl: Installer Edition!

This is going to be a more technically heavy post, which is really what most of my posts should be.  Anywho, one of my tasks this last week has been to build an installer for our game.  Given that we are building the game in Unity, it'd typically be pretty easy: there's a simple build system built right into the editor.  You select the scenes you want, in the proper order, click the button and let it go!

When has that ever worked?  I imagine it probably would work pretty well for a normal game, but ours runs three separate programs, two of them simultaneously, to handle all the weird stuff we do.  To start, when the game launches, we let the player select any mp3 they want.  This was a problem in itself because the file dialog box we were using is only available in the editor, so just to make a build I had to find a workaround.  Once the file is selected we launch a C++ program we wrote to parse the mp3 and generate the level from it.  From there we launch the game in Unity along with PureData, an open source audio engine, since Unity's FMOD wrapper doesn't give us anywhere near the amount of control we need to manipulate the song in real time.
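For the curious, here's roughly how that launch sequence could look from the Unity side.  The executable names, arguments, and folder layout below are placeholders for illustration, not our actual setup.

```csharp
using System.Diagnostics;
using System.IO;
using UnityEngine;

// Rough sketch of launching the external tools from Unity
// (executable names and arguments are hypothetical, not the real setup).
public static class ExternalTools
{
    public static void GenerateLevelAndStartAudio(string mp3Path)
    {
        string toolDir = Path.Combine(Application.dataPath, "External");

        // Run the C++ level generator and wait for it to finish writing the level data.
        var generator = Process.Start(new ProcessStartInfo
        {
            FileName = Path.Combine(toolDir, "LevelGenerator.exe"),
            Arguments = "\"" + mp3Path + "\"",
            UseShellExecute = false
        });
        generator.WaitForExit();

        // Launch PureData alongside the game to handle the real-time audio manipulation.
        Process.Start(new ProcessStartInfo
        {
            FileName = Path.Combine(toolDir, "pd.exe"),
            Arguments = "-nogui vinyl_patch.pd",
            UseShellExecute = false
        });
    }
}
```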

Because of all these shenanigans, we are constantly referencing song and level data files that are stored outside of Unity.  Turns out Unity isn't a fan of that: their system has you throw every asset required for the game into a Resources folder that then gets written to a giant binary file.  That won't work for us, so I've been figuring out a workaround.  When you build a Unity project it creates an exe and a game_data folder, which it apparently uses as its working directory.  If I copy our External folder into the game_data folder, everything is then in the right place.  However, several little things have been causing problems since: even though the relative paths are right, I can only seem to get files to open successfully with absolute paths, and each method we use behaves a little differently, with some requiring forward slashes while others prefer back slashes.
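Here's a small sketch of the path handling idea, assuming an External folder copied next to the built player's data; the folder and method names are illustrative rather than our actual code.

```csharp
using System.IO;
using UnityEngine;

// Sketch of resolving song/level files to absolute paths so the same code works
// in the editor and in a standalone build (folder names are hypothetical).
public static class ExternalPaths
{
    // Application.dataPath points at Assets/ in the editor and at the _Data folder in a build,
    // so copying the External folder into the _Data folder keeps the relative layout the same.
    public static string Resolve(string relativePath)
    {
        string full = Path.Combine(Path.Combine(Application.dataPath, "External"), relativePath);

        // Hand back an absolute path with consistent separators,
        // since some of the file APIs we call are picky about slashes.
        return Path.GetFullPath(full).Replace('\\', '/');
    }
}

// Usage: File.ReadAllBytes(ExternalPaths.Resolve("levels/current_song.lvl"));
```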

Once I get all the file reading to work properly in the built version, actually creating the installer should be fairly straightforward.  Since this game is going to be Windows only, Visual Studio has an Install Wizard project type I can use.  I just have to tell it what folders and files are required for the install and it more or less behaves like every other Windows install wizard you've ever seen.  It's pretty fancy stuff.

Hopefully we'll have a working installer we can send out to appease the masses by tomorrow.

Design, Communication and Faculty

So anyone who has read my blog posts from the end of last semester knows that we made a lot of dramatic design decisions in rapid succession, without really settling on the best one and without really having time to implement any of them.  We more or less got to an alpha build over the summer, largely thanks to Cody and Sherly, but as soon as school started and everyone was back to have an opinion on design, we seemed to lose focus once again.

Our biggest problem is probably communication, because if you were to actually track what the engineers have been building over the last 6 weeks, you'd see we've more or less been on the same track this whole time.  We've had a couple weeks where we tried new things to see if they were fun, but for the most part we've been working toward the same goal, while discussing, a little too loudly, potential designs that were never meant to be officially implemented.

In addition to that, the art side of our team has fallen behind because we didn't communicate our needs.  With both of those combined, every member of the team had a different idea of what the game was supposed to be, which became painfully obvious when it came time to present to the faculty last week.

With the lack of art assets and direction so close to our IGF submission deadline, everyone was worried.  In retrospect, I think the problem seemed a lot bigger than it was, but it finally got us to sit down and plan out exactly what we want the game to be before and after IGF.  After a couple hours of design discussion we decided to change how a few features work, which makes the game feel more cohesive as an experience and just more fun to play.  Ironically, it's quite a bit more like the original prototype than it has been since May.

On top of that we have outsourced a bit of art, put Zeph in charge of the in-house artists and seem to have a bit of a content pipeline going.  Of course, fall break just hit, which put a bit of a snag in our newfound groove, but all the engineers are still meeting twice this week to implement a bunch of the mundane but necessary things such as fancy menus and building an installer (which I'll write about next, because it's proving to be quite the ordeal).

Things should hopefully be going a bit more smoothly from here on out... we shall see!

Thursday, September 26, 2013

Vinyl Shockwave!


During the last couple weeks of Vinyl development I've been implementing various gameplay features.  The biggest two have been obstacle generation and the shockwave (pictured above with programmer art).  We've been generating obstacles based on the beats and notes of songs for a long time, but it was thrown in quickly and the different types of obstacles could block each other.  For instance, a ball could spawn at the same spot as a chorus filter, and the player couldn't hit one without the other.

I went through the generation code and made sure obstacles spawn clear of one another, and added some functionality for spawning chains of filters for potential combos.
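A very rough sketch of the kind of spacing check I mean (simplified, with made-up names): the real generation works from beat data, but the core idea of nudging a new obstacle forward until it's clear of everything already placed is the same.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of obstacle spacing, not the actual generation code.
// Before placing a new obstacle at a point along the track, make sure nothing else
// already occupies that stretch, and push it forward if something does.
public class ObstacleSpawner : MonoBehaviour
{
    [SerializeField] private float minSpacing = 2f;   // minimum distance along the track between obstacles
    private readonly List<float> occupiedPositions = new List<float>();

    public float FindClearPosition(float desiredTrackPosition)
    {
        float pos = desiredTrackPosition;
        bool moved = true;
        while (moved)
        {
            moved = false;
            foreach (float occupied in occupiedPositions)
            {
                if (Mathf.Abs(pos - occupied) < minSpacing)
                {
                    pos = occupied + minSpacing;   // push past the blocking obstacle
                    moved = true;
                }
            }
        }
        occupiedPositions.Add(pos);
        return pos;
    }
}
```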

The way cooler thing I worked on was the shockwave.  Now when you jump out of the halfpipe and land, everything within a certain expanding radius goes flying up and out of the pipe.  It's pretty satisfying.  As of right now, it just radiates outward from the player's initial impact to the designer-specified max radius over a specified time, blasting everything it hits out of the way as it goes.  I threw on the fancy static ball shader for now, so that we can visualize why the obstacles are flying away.
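Here's a minimal sketch of that expanding-radius idea in Unity terms; the field names and force values are illustrative, not the actual implementation.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Sketch of an expanding shockwave: the radius grows from the impact point to a
// designer-specified max over a specified time, flinging anything it reaches.
public class Shockwave : MonoBehaviour
{
    [SerializeField] private float maxRadius = 10f;     // designer-specified max radius
    [SerializeField] private float duration = 0.5f;     // time for the wave to reach maxRadius
    [SerializeField] private float launchForce = 20f;   // how hard obstacles get blasted out of the pipe

    private readonly HashSet<Rigidbody> alreadyHit = new HashSet<Rigidbody>();

    // Call this at the point where the player lands back in the halfpipe.
    public void Trigger(Vector3 impactPoint)
    {
        alreadyHit.Clear();
        StartCoroutine(Expand(impactPoint));
    }

    private IEnumerator Expand(Vector3 impactPoint)
    {
        float elapsed = 0f;
        while (elapsed < duration)
        {
            float radius = Mathf.Lerp(0f, maxRadius, elapsed / duration);

            // Fling everything the wavefront has reached up and away from the impact.
            foreach (Collider hit in Physics.OverlapSphere(impactPoint, radius))
            {
                Rigidbody body = hit.attachedRigidbody;
                if (body == null || !alreadyHit.Add(body)) continue;   // only blast each obstacle once
                Vector3 dir = (hit.transform.position - impactPoint).normalized + Vector3.up;
                body.AddForce(dir * launchForce, ForceMode.Impulse);
            }

            elapsed += Time.deltaTime;
            yield return null;
        }
    }
}
```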

Wednesday, September 11, 2013

Vinyl After the Summer

Alrighty, Vinyl got a pretty major overhaul over the summer.  Basically all the work we wanted to accomplish last semester but had no time for is now done.  Big props to Sherley and Cody for doing the bulk of the programming, since the rest of the engineers had internships and were out of town.  I managed to throw together a useful main menu from Montreal and sit in on several meetings via Skype, but for the most part I was busy with internships and had dodgy internet everywhere I was staying, which made VPNing into the university network rather difficult.

Anywho, here's a short video of what the game looked like at the beginning of the semester, since I'm a couple weeks behind on these posts.



As you can see, the pipe is now generated by the song itself and curves to the left like the groove in a record.  There are some filters and scratching that carried over from the prototype, as well as a remixing system and some grinding rails that I believe were cut before this video was made.

Since the semester started we've been doing a bit more of an art pass, building the style we want for the game, and fixing bugs.  I personally fixed a few bugs in the mesh generation and primarily worked on getting a rope made of tiny boxes to tether the player to the needle.
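For the curious, a rope like that can be sketched as a chain of small rigidbody segments joined end to end between the needle and the player.  This is an illustrative version with made-up names and segment counts, not our actual tether code.

```csharp
using UnityEngine;

// Sketch of a "rope of tiny boxes" tether: small physics segments joined in a chain
// from the needle to the player (segment count and size are illustrative).
public class BoxRope : MonoBehaviour
{
    [SerializeField] private Rigidbody needle;
    [SerializeField] private Rigidbody player;
    [SerializeField] private int segmentCount = 12;
    [SerializeField] private float segmentLength = 0.25f;

    private void Start()
    {
        Rigidbody previous = needle;
        for (int i = 0; i < segmentCount; i++)
        {
            GameObject box = GameObject.CreatePrimitive(PrimitiveType.Cube);
            box.transform.localScale = Vector3.one * segmentLength;
            box.transform.position = Vector3.Lerp(needle.position, player.position,
                                                  (i + 1f) / (segmentCount + 1f));

            Rigidbody body = box.AddComponent<Rigidbody>();
            HingeJoint joint = box.AddComponent<HingeJoint>();
            joint.connectedBody = previous;   // link this segment to the previous one
            previous = body;
        }

        // Finally attach the player to the last segment so the needle drags them along.
        HingeJoint playerJoint = player.gameObject.AddComponent<HingeJoint>();
        playerJoint.connectedBody = previous;
    }
}
```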

Here's an updated screenshot of the visual changes since the semester started.

The game is starting to look a little less like a prototype now, but still has some of the design/team issues I blogged about last semester.  We're working out the kinks and should be ready with something cool for IGF.

Year 2 Start and Summer Recap

So this blog's been pretty inactive for a while.  Last semester was ridiculously hectic, getting worse and worse until it finally came to a sudden early end when I got an internship offer from Ubisoft Montreal on extremely short notice, starting halfway through finals week.  I managed to wrap up everything I was working on and get to Montreal with about 12 hours to spare, but the blog suffered a bit for it those last few weeks of school.

Anyway, the Ubisoft internship was really fun and I learned a ton about enemy AI, or at least how UDK does it.  It should probably be noted that this wasn't a traditional internship; it was more like a student project on a grand scale supervised by Ubisoft, which I got by competing in their game prototyping competition.  We basically built the winning team's prototype into a full-fledged hour-or-two experience in 8 weeks using UDK.  It was an intense two months, but the end result was fantastic, and I was blown away by what some of those art students up in Montreal can do.  Here's a teaser video that has a download link in the info if you're interested.


Since that internship started so early and was only 8 weeks, I was done by the end of June with two months of summer left, so after taking a week off I headed down to Baltimore to intern at Zenimax Online working on the Elder Scrolls Online.  This was a more traditional internship, and I spent the next two months building internal tools for the QA team.  It was great to see how a real studio operates from day to day and month to month, and the game looks great.

The best part of these two internships was all the talented and well-connected people I met.  I'm no longer worried about being able to find a job come May.

Onto this semester!  I know we're approaching 3 weeks in already, but the end of my summer was almost as hectic as the beginning.  I basically worked at ZOS until the Friday before school started, got back to Utah Saturday night, moved into my new house Sunday, started my new job as a TA on Monday, and class started Tuesday.  And the homework for the first week pulled no punches; I'll talk more about that when it's done (hopefully this afternoon) and make another update about Vinyl shortly!

It's good to be back!

Friday, April 26, 2013

Real Time Graphics Asg12

Check out my shadows, yo.  This assignment was easier than I was expecting.  The thing I had the most trouble with, even after questioning it in class and discussing it with Chris, was how the fragment from the shadow map could possibly correlate to the projected view.  It wasn't until I wrote the shader and saw what values we were actually comparing that I realized how it worked.

Anyway, here's my shadow map. 
Other problems I ran into mainly consisted of setting the directional light up properly.  Turns out that if I had just followed the writeup from the beginning I wouldn't have had a problem.

Here's the source code.  Sorry, I don't have more to say, it's 1am and this is the last thing I have to do to finish the semester.  It was brutal, but fantastic, learned a ton, thanks so much!



Monday, April 22, 2013

Catch up.

As I mentioned in the last post, Vinyl's had a lot of design changes this semester, but we had a few problems.  These mainly arose because of the poor timing of a lot of big events this semester, which pretty much killed our team's workflow, as well as a somewhat nebulous design plan.

Mike and I wanted from the beginning for the game to be everyone's game.  We know how not having interest in a project can kill your desire to work on it; we saw a lot of that in cohort 2 and wanted to try to prevent it.  So from the beginning we tried to incorporate everyone into the design process.  This worked fine with a small team, especially since there were only ever two or three of us vocal about any particular mechanic.  Once we blossomed into a team of eleven, however, every design meeting became a smorgasbord of new features.  People would talk to whoever was nearby about possible ideas, and some would make it through the grapevine and become solid ideas without ever going through official channels.  Having a more organized setup for managing which ideas are new, how they fit in the game and whether we want to prototype them would have been helpful.  Perhaps, in addition to our daily standup meetings, we should have had a daily design meeting where everyone spit out whatever new feature was on their mind and we decided whether we wanted it in or not.
Still, with the chaos of half or more of the team being gone for the majority of the alpha phase (if you can even call it that), I feel this sort of nebulous approach kind of worked.  Since there were so many ideas floating around, it gave the engineers that were around plenty of things to tinker with, without having to worry about an actual end goal.  Essentially, we were still in the prototyping phase for most of the semester, which made sense considering we didn’t have the manpower to go into full blown production.

Since we spent so much more time prototyping, we now have a lot of cool features available to us, loosely linked together.  To help with this nebulous approach to design, and since we feel everyone has now had pretty ample say in what the game is, we are in the process of appointing two designers to have final say on features, with a product manager to keep us on track for who our target audience is.  We still want to take everyone's input seriously, but we need a clear, focused goal.  Also, now is a good time to make this transition because we seem to be at a place where everyone is pretty excited about the general vibe of the game, so maintaining that feeling should appease the masses… in theory.
Right now we are thinking Mike and I will have final say in design, but the decision is still up in the air.  What we want is two designers to sit down and decide what will and will not be part of the final feature list before the summer starts.  This is because we are pretty far behind, and if we want to have something presentable for IGF we have all agreed we have to work on this over the summer.  Most of the team will be around and available to work on the project this summer, and they are thankfully the people I have grown to trust to actually get the work done.

If all goes to plan, we should have all the major features, including the few massive undertakings I mentioned earlier, in some sort of alpha stage before school even starts next semester.  Therefore we should be well on our way to having a reasonable beta by October.

Tuesday, April 16, 2013

Vinyl Over the Months

I've been painfully aware that my blog posts this semester have been sporadic and pretty barren, so I figured I'd take this time to actually make a thorough post about the design evolution of Vinyl.  My next post will be about the problems we've had making progress over the semester and what we plan to do to remedy that.

First Design

The initial idea for Vinyl bears almost no relation to what the game is now.  Way back at the beginning of the semester, everyone in our cohort was charged with pitching a game idea as a potential thesis project.  One of our professors commented that he really liked music games, so making one might help our chances of getting chosen.  This sounded great to me; I've always wanted to make a music game.  So I got together with Mike and we tried to come up with an idea by looking at every cool music game we knew of.

Naturally that didn't work out all that well, but a few days later I came up with the idea of turning the grinding mechanic from Sonic Adventure 2 into some kind of music game where you grind on the strings of an instrument.  The idea was that by jumping from rail to rail and dodging obstacles you could essentially play the string part of the song.  This blossomed into a few other ideas, such as flying through a brass instrument or bouncing off drums.

Second Design

Pretty much the only ideas that carried over from that version were that it's a music game, you can grind, and it was still sort of a forced runner, or at least that possibility was solidified.  At this point Mike came up with the idea of being inside a record.  The player would grind along the ridges and do similar things as before.  Oh wait, I lied: we also tried to incorporate the older instrument ideas as a sort of bonus section.  The problem was that by essentially being the needle on a record, you would be in the groove and not on the ridges in between.  This question of how far we push actually being a needle in the record versus just breaking the rules of how records work kept recurring.

Third Design

When we officially started prototyping, we decided it would be better if we were in a groove and the game played a bit like the special stages in Sonic the Hedgehog 2.  The game got much simpler at this point: we went back to the roots of what makes music games interesting and incorporated synchronizing gameplay with music.  To do this we spawned enemies based on the notes of the song and sped the music up or slowed it down according to the speed of the player.  Some of this may have been in the second design; I'm a bit fuzzy.  The main point of this iteration was to emulate being in a record while still being fun and not destroying the song too much.

Fourth Design

Our current plan is very similar to the third design.  We have just included a few more mechanics, like grinding on the edges of the halfpipe.  We decided after prototyping that we were almost exclusively making the song sound terrible and that we wanted to look more into enhancing the music.  We found ways to do that by studying games like the new SSX.  Also, to further solve the problem of what a needle in a record is actually doing, we decided the player should be pulled by the needle like a wakeboarder.  The player can influence the needle a little, and the needle never leaves the groove, but the player can.

I'm pretty happy with the ideas for the game.  However, we were ideally supposed to be design locked ages ago.  We're currently prototyping these new ideas and plan to be design locked by the end of the semester so the students available can try to get us into alpha over the summer.  Next post I'll discuss how we got behind.

Thursday, April 11, 2013

Real Time Graphics Asg11

This was a cool assignment.  It was nice to have an easy one after the ridiculous month I just finished.

The only real problem I encountered was copying and pasting handle code around that set things using the wrong constant tables and whatnot.  This actually led to a cool picture-in-picture effect I hadn't planned on.  I ended up setting the GUI texture to the already rendered back buffer texture.  I think at one point I even had it stretched over the whole screen, so I was rendering everything to a texture and then copying that over pointlessly for some reason.

Other than that the assignment was pretty straightforward.  When you're done with everything, save it to a texture, draw stuff onto that texture, then display it.  Now that I think about it though, it seems like we should be able to just draw things directly onto the render target rather than copying things around.

Anyway, here's what it looks like with a kiwi as a health bar and a black vignette effect.  I also changed the clear color to green so that you can actually see the vignette effect.


And here's the code.

Real Time Graphics Asg10

Check it!
You see that crazy transparency on the blue spheres as they get close to the wizard's face?  Yeah, I totally did that.  You can too; the arrows move all the spheres around, but not the floor anymore!  As you get closer to the floor it gets pretty transparent.  I went a little overboard with the effect so you can really see it happening.

This assignment was kind of a tricky one.  I had a lot of problems with the depth buffer and what to do with it.  First I tried to copy the texture back to the actual DirectX depth buffer, which triggered an invalid call error.  I was on an airplane back from Montreal while doing this, so googling the problem was kind of impossible.  Eventually I realized there was no reason to actually do this, since we can just sample from the texture.  Additionally, figuring out when to set the render targets and where to set them back had me staring at a beautiful black window for some time.

Here's what my depth buffer looked like by the way, and right afterwards is the actual depth buffer.
The biggest problem I encountered was something I didn't even think was a problem with mine until right before I solved it.  Other Jason was having this weird problem where only part of his spheres were showing up, and they went crazy when he moved them.  My spheres weren't rotating and I had never moved the meshes, only the camera, so I just assumed mine worked and spent my time trying to solve his problem.  Turns out we were updating the meshes' positions after we did the depth pass, which obviously you don't want to do.  We were just calling update on every mesh first thing in the draw loop, but since we added the depth pass before that loop we didn't think to move the mesh update above it.  Luckily I figured it out right around the same time I realized it was even a problem for me.  :)

One thing I kept wondering while working on this assignment: when implementing the wobble technique we were told that games only ever do that effect once.  I figured this was because writing the entire screen to a temp buffer, then copying that over and manipulating it to make an object look like it's affecting the background, is really costly.  But we sort of do that a second time in this assignment with the depth buffer (although we never copy it back), and we definitely do it again in the next assignment with the GUI stuff.

Anyway, here's the code.




Sunday, April 7, 2013

Montreal!

Woo, busiest month of my life is now over.  I get to relax by catching up on the last 3 weeks of homework I had to neglect because of the Ubisoft competition and GDC... and flying to Baltimore for a day for an interview.

The Ubisoft competition was cool though; there were some ridiculously good looking prototypes there.  Guess that's what happens when your team of 8 has 6 artists instead of 1.  That said, we probably had cooler mechanics than any of them, or at least more complex ones, because we were so programmer heavy.  We definitely had more gameplay features.  We find out on Thursday who won what and who will get the internship.  Also, it turns out I don't understand French, so I had a hard time following most of the presentations.

Montreal seems like a cool city, or at least the 3 blocks of it we got to see.  Alcohol flows more freely there than in Utah at the very least.  Though alcohol flows more freely everywhere than in Utah.

Even if we don't win, this competition was a pretty awesome experience.  It was interesting doing an even bigger prototype than we'd ever done in class, in a new and not even remotely user friendly engine, while simultaneously going to work and school, with a bigger team than usual to boot.  I may have bitten off a bit more than I could comfortably chew this semester, and I can chew a ton, but it was definitely doable and I still got some skiing in.  It makes me happy knowing that next year will probably be substantially easier than this one, cause I'm pretty burnt out.  It's gonna be nice to get paid during crunch time rather than paying tuition to be in crunch time.

Anyways, not much more to say, at least not more than I'm willing to say. ;)  I think this experience will help when making Vinyl next year.

Sunday, March 31, 2013

GDC and Vinyl stagnation

I just got back from GDC.  It was pretty cool, and I can check it off my list of big gaming conventions I'd like to attend one day.  That just leaves E3, and then I'll have pretty much all the big ones, including TGS.  That said, pretty much no one seemed too interested in hiring interns this year.  I had way more fun just finding triple-A studio programmers and asking them questions, and picking the other big schools' students' brains about their programs.  USC, Digipen and RIT were all there in pretty full force.

Next year should be awesome though, since everyone seemed pretty interested in hiring entry level stuff, and Vinyl will be almost complete.  Also, I plan to buy the nice pass and actually go to a bunch of the talks.  I went to one, but it was more about selling their product that automates LODs than talking about how they implemented their software.  Still, looked like a cool product.

Anyways, the reason I mentioned Vinyl stagnation in the title is that, sadly, we haven't gotten any real work done on it since the industry panel.  One thing led to another and there just hasn't been any time.  Basically, due to delays we didn't even do the industry panel until the week before spring break, when it should have been 3 weeks before.  So by the time the teams were settled we had a week off.  Then a week on again before GDC, and now another week after that where half our team, myself included, will be in Montreal.  That leaves about 3 weeks left in the semester to get into alpha.  Looks like we'll be doing a bit of summer work.  That's okay though; I'm still more excited about this idea than any of the other projects I'm working on.  They just sadly had to take precedence for the time being.

Speaking of which, the main thing that has been occupying my time lately is the reason we're going to Montreal.  A bunch of us are in a game prototyping competition that Ubisoft Montreal is putting on.  Since we leave on Tuesday, we've spent the last two weeks crunching out the end of the prototype.  Last weekend we went from a tech demo with empty levels to basically a full-blown game in about 25 hours of work time.  This weekend, shortly after arriving back from SF, we polished everything we could and tweaked a few more things.  The game looks amazing now and I can't wait to show it off in Montreal.

Still, I can't help thinking maybe I put a bit too much on my plate this semester.  4 weeks til summer though; can't wait to just have a job for a few months.

Wednesday, March 20, 2013

Real Time Graphics Asg09

It's so shiny!!

Seriously, look at how shiny that dude's face is; it looks especially creepy on the floor.  It's like you can almost see an entire galaxy in his face, or maybe just a nebula, I dunno.

Anyway, this assignment was fairly straightforward; I just ran into a few bugs.  Mainly, fixing the camera to work how it was described in the email broke a few things.  I believe the problem was resolved by negating a value.  I'm not entirely sure why we needed to change anything, since from what I understood we were just passing in extra data that we didn't need originally.

Other than that the environment map worked just fine.  DirectX has great support for it.  I wonder, do people invent these technologies and then DirectX includes them in the next major release or how does that stuff work?  I thought environment mapping was a fairly new thing, but we're still rocking DirectX9, which is almost 3 years old now.  Then again, the teacher mentioned Shenmue 2 doing it and that was like 2002...

Moving on to the second part of the assignment: making anything behind the first sphere wobble according to a tangent function.  I dunno why we like tangents so much, probably cause of the asymptote that breaks up the squiggles.  In order to do that we had to draw all the non-transparent geometry, save that to a texture, and then draw the transparent stuff using that texture as a background.  Then copy it all back and actually draw it to the screen.  Here's the screenshot showing the texture before the transparent entity in PIX.

I was having trouble figuring out where to place the code that copies everything back, because when I stuck it in my transparency if-checks it never fired, since none of the entities were technically marked as transparent.  This led to a nice black screen, even though I could see all the stuff being rendered properly in PIX.  Sherly helped me modify my if-check so that I got everything but the floor; it turns out I was now copying the texture over to the backbuffer, but was doing it twice, so I overrode the floor with the second entity.  It was kinda neat to step through each draw call in PIX and watch the exact moment the floor went bye-bye.  I fixed that one with even more modifications to my if-check.

Those were the only real problems; the actual graphics part of the assignment was pretty straightforward.  Which is odd, cause they are both pretty bizarre ideas.  Reflecting something that isn't actually there has a surprisingly cool effect that is very noticeably different from just specular.  I assume you would generally want the environment map to mirror whatever is in your skybox though.  I was also blown away by how we achieve the squiggly ball effect.  Literally saving the entire frame and then drawing over it seems like a ridiculous amount of extra work to do every single frame.  That said, we are essentially doing that anyway, with each mesh being drawn on top of the previous one, but we're only drawing the fragments we need each time.

The teacher said we could only do it once or it'd get too slow, but I wonder if we could pull it off again considering how little we're actually doing in our scenes.

Anyway, source code!


Thursday, March 7, 2013

Real Time Graphics Asg08

This week's assignment was pretty cool actually.  I always had an inkling of what bump mapping was from being around games all the time, but upon actually implementing a simple normal map I'm a little blown away by what you can do to a surface by just altering some normals.  Here's the assignment for the random googlers.  And here's what the assignment ended up looking like.  Guess which one is specular.


I'm not entirely sure what to include in this writeup; usually I talk about all the horrible things that went wrong, but this one was pretty straightforward.  The hardest part was properly scaling the cube in Maya to output a cube the same size as our hand-coded cubes.  Changing the Maya exporter to output bitangents and tangents was exactly the same as how we were already outputting the normals.  The actual fragment shader code was pretty simple too, just modifying some normals with the new values.

I did find it interesting how DirectX handles multiple textures per mesh.  I figured we'd have to explicitly call SetTexture with the specific texture name; their way is much nicer.

Other than that I spent a bunch of time making new text files and cleaning up some things.  I finally took the specular exponent out of the scene file, even though I wasn't even using it.  Additionally, I decided on an expected texture type where 0 means just a texture and 1 means texture plus normal map.  This way I can just do a simple int compare in the update loop; it might not be the most human readable though.  Spoilers: 2 will probably include environment maps.

I also spent a lot of time figuring out how I broke my specular lighting a while ago.  Apparently I was passing in a zero for light intensity, which zeroed out my specular lighting when multiplied through.

Anyway, here's the code.

Vinyl Greenlit, Administration discussions ensue.

So Vinyl got through the industry panel, because it's awesome.  Our team has now more than doubled in size.  Sadly, two of the four projects got cut, including Ludology, which I thought was a fantastic idea and my fallback plan in case Vinyl got axed.  Ah well, such is life.  Cosigners and Vinyl absorbed the other teams and now we finally get to start planning to make a game that isn't just a prototype.

We spent most of the first week debating platforms and engines.  A lot of the feedback we received suggested we do a mobile game.  We kind of like that idea, but we wanted to use CryEngine, which can't easily deploy to anything but a PC.  The more we debated it, though, the more we decided we could make a better game on PC, and that trying to hit mobile just because it's a better market is pointless.  We are a student game and we should be trying to execute a cool proof of concept.  If we have some time when we're done we could look into porting to Unity and deploying a lesser version on mobile.

On the other hand, it looks like we may have to pony up a few hundred thousand Euros to get a reasonable CryEngine license, so we might end up using Unity from the beginning.  We expect to spend most of our time figuring out how to process individual songs' sound waves anyway, which will be engine independent, and then using that data to generate levels.

So far we've looked into a few different open source packages to do that, as well as a Unity library.  Looks like I might have to brush up on my partial differential equations as well.  Who knew those things actually had a use.

I'm gonna mess around with those over spring break and report back with some actual engineering stuff hopefully.  Catch yah on the other side.

Thursday, February 28, 2013

Real Time Graphics Asg07

This assignment wasn't too shabby.  Once again the hardest parts seemed to be the sorting and setting up the PIX calls properly.  I probably went about this the completely wrong way, but it's been one of those weeks, or months, or semesters, or whatever.  Anyway, for the assignment we had to set up PIX event calls to make debugging in PIX easier.  Which is pretty awesome, but I haven't really had a reason to use it in a while.  I suppose that's good; I'm sure I'll learn all the details when something goes horribly wrong, as with most debugging tools.  Check out these awesome buckets though.




The transparency stuff wasn't too bad.  Sorting the draw order so you could see all three cubes inside of each other was kind of a hassle, but not terribly difficult.  I now have somewhat gross sorting loops in more than enough places, but hey, selection sort on 9 items is probably manageable.

I think the thing that caused me the most trouble was actually a sorting loop that got copied and pasted incorrectly from another one.  It was causing my transparency to not work at all, I think because certain settings were turned off at certain times.  Once I caught the error, a counting issue, everything just worked.

Also, I had an issue with the world being incredibly dark, but I think that problem stemmed from some bigger issues.  I'm not sure my specular lighting was functioning exactly as it should.  Speaking of which, I finally moved the specular exponent to the materials, so be happy about that.  :)  I just realized I did not take the number out of my scene file though, so it is still being read in and parsed and not used.  I'll fix that later; everything is already zipped up and uploaded now.  Anyway, back on track: I thought my specular lighting wasn't quite working properly, and upon looking into it I noticed I was actually passing a zero into the shader instead of my scene file's value.  I probably switched it out a while ago as a test and forgot about it.  Upon fixing that, and a few other things I messed up this week, the scene went really dark.  Part of that was because my ambient lighting was really small to compensate for some of the other errors I had.

Long story short, there were several small lighting problems that when fixed turned into other problems.  I think I got it all sorted now.

Here's the code.  Sorry this writeup was so rambly and all over the place.  Hope you got the gist of it; nothing in particular actually pertaining to the assignment was terribly difficult this week, just a lot of little things that needed to be fixed.

Tuesday, February 26, 2013

Vinyl and the Industry Panel

Last night was the "first gate" that we had to get through to get our prototype greenlighted... sorta.  I kind of feel like getting chosen among the 15 or so pitched games was really the first gate, but I suppose not.  We're mostly programmers; let's call that one the zeroth gate.  Anyway, the game has progressed a ton since I last wrote about it.  It currently has particle effects instead of spheres for the static, and the lasers look a little better and even pulse as they pass by the player.

Most importantly, we have an equalizer now that jumps to the music, and we spawn the enemies based on the intensity of the song.  It definitely needs a lot of tweaking (some songs do almost nothing and others spawn absurd amounts of enemies), but it was effective for the prototype.  Here's a video of someone playing really well; you can see it get crazy when the metal part starts.
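For a rough idea of what intensity-based spawning can look like in Unity, here's a sketch that samples the playing song's spectrum each frame and spawns an enemy when the low end spikes.  The threshold, bin range, and names are placeholders, not our actual analysis code.

```csharp
using UnityEngine;

// Illustrative sketch of spawning enemies from song intensity.
// Samples the playing AudioSource's spectrum and spawns when low-frequency energy spikes.
public class IntensitySpawner : MonoBehaviour
{
    [SerializeField] private AudioSource music;
    [SerializeField] private GameObject enemyPrefab;
    [SerializeField] private float threshold = 0.1f;
    [SerializeField] private float cooldown = 0.2f;   // keeps loud songs from spawning absurd amounts

    private readonly float[] spectrum = new float[256];
    private float lastSpawnTime;

    private void Update()
    {
        music.GetSpectrumData(spectrum, 0, FFTWindow.Hamming);

        // Sum the lowest bins as a crude measure of beat intensity.
        float lowEnergy = 0f;
        for (int i = 0; i < 8; i++) lowEnergy += spectrum[i];

        if (lowEnergy > threshold && Time.time - lastSpawnTime > cooldown)
        {
            lastSpawnTime = Time.time;
            Instantiate(enemyPrefab, transform.position, Quaternion.identity);
        }
    }
}
```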

http://youtu.be/icjEKQDvM-8

We also have a video of someone butchering Thriller since changing your speed affects the pitch of the song and moving side to side affects volume.

http://youtu.be/HjB0d-RGBAs

The game is pretty fun to play; it just needs some further refinement and probably a few more features.  And a real artist: right now it's pretty much been Mike and me throwing colors into the mix.  If we get past the first gate we have some big plans, which I'll cover in another post.

The first gate went really well, but we'll find out for sure on Thursday which games made the cut.  Everyone put forth a fantastic presentation and had some great prototypes, so it'll be interesting to see what ends up happening.


Thursday, February 21, 2013

Real Time Graphics Asg06

This assignment was pretty cool; it was the first time I've ever dealt with reading and writing binary files, and it's a lot simpler than I thought.  I still encountered an infuriating bug that was really hard to fix since everything was in binary, but finding it was pretty easy.  Fixing it ended up being easier than what I was trying to do in the first place anyway.  I'm still baffled by the stuff you can do with C++.  This ain't your grandmother's programming language... then again, it might be old enough to be.

Anyway, onto the graphics, this assignment was more about sorting and optimization than anything.  We had to sort our entities by their shader and texture in an effort to minimize the slow swapping operations in the update loop.  As you can see by my lovely screenshot here, I set the materials together, the effects together and then all the entity stuff.  I basically just sorted the entity array after I parsed it in.

The actual fun part of the assignment was writing the Maya plugin.  Well, more like writing a method of the Maya plugin.  It was still awesome to be able to model something in Maya, export it into my customized format, and parse that into my own graphics system.  I have a feeling that when it comes time to add FBX support to our game engine's renderer next semester, we'll have a much easier time.  Nothing really bad happened with this section of the assignment, except the Maya install on my laptop apparently broke between December, when I last used it, and now.  I couldn't hit the browse button in the plugin manager, or the export all button, or most menu buttons without the program immediately becoming perpetually unresponsive.  It wasn't the plugin either; I couldn't even get that far.  I ended up just using my lab computer.

The only big problem I ran into was the one I mentioned above.  Since I'd never written or read a binary file before, I was a little confused about how to read it back in.  I ended up trying a way Sherly showed me, which confusingly only worked some of the time.  It worked flawlessly for my cubes and for some of the simpler shapes I exported from Maya, but one of my shapes only drew a single triangle and the torus read in a negative vertex count.  It's hard to instantiate arrays with a negative size.  Anyway, her method worked for her, and involved some bitwise ORs to convert chars back to an int, but Cody showed me a much easier method that works 100% of the time.  I'm thinking Sherly's method didn't work from machine to machine due to big endian and little endian differences, but I'm pretty sure she tested hers on the same lab machines we were using, so I'm not sure.

Anyway, it was a pretty neat assignment; here's the code.  Oh, and one thing to note: with the changes I made to MeshBuilder in order to write the files to binary, it no longer overwrites existing files, so if you change a mesh in the Assets folder, you'll currently have to delete the mesh files out of the temp/data folder.  Sorry about that, I'll find a fix for it next week; it's probably as simple as using another method out of that library that has CopyFile instead of using fstream.

Thursday, February 14, 2013

Real Time Graphics Asg05

This assignment was pretty cool.  It's nice seeing all that effort we had to put into this pipeline initially allow us to expand the scene so rapidly.  Just look at all these cubes!
Complete with specular and diffuse lighting.  The four on the left are specular, as you can see by that awesome glare on the left Rubik's cube, while the right four are only diffuse.  Also, the front four use 24 vertices with proper normals, while the back four use the mesh with only 8 vertices and averaged normals.

The controls are the same, WASD to move the camera, IJKL for the light and the arrow keys for the meshes.

I'm not entirely sure what to talk about this time around.  Specular lighting is pretty awesome, but the math behind it is pretty simple.  It made for an easy assignment that consisted of maybe 30 minutes of graphics-related work and maybe 4 hours of creating and manipulating text files and then debugging said text files.  I ended up using the message logging system a ton to figure out where all my errors were, so I'm glad I kept up on our teacher's policy of logging errors everywhere.

One issue that kind of confused me is that the specular lighting doesn't appear to be working on the back-left two cubes that use the averaged normals.  Supposedly it should work better, and it's hard to tell because they are so far back, but I don't think it's working.  You can rearrange them pretty easily with the text files if you are interested.  I don't get how the lighting can work for the other meshes but not for all of them.

Additionally, I set up my code so that the specular lighting exponent could be read in from the scene file before noticing I was supposed to put it in each individual material file.  That makes sense, so we can specify how rough or smooth and shiny various materials are, but I don't have the sheer force of will to go back, change all the code, and add the getters and setters to make that work right now.  I'll probably switch that up next week or when we do some bump mapping (we are doing bump mapping eventually, right?)

Oh, one last question I forgot to bring up in class, but found interesting.  I chose to pass the world-to-view transform into my specular fragment shader because it was easy, but it seemed like it'd be faster and more efficient to output it from the vertex shader and intercept it in the fragment shader.  Then you'd have a vertex shader outputting information that not every fragment shader needs, however.  Is there a better choice for passing this data?

Here's the source code!

Tuesday, February 12, 2013

Colors Everywhere!!!

Well, Vinyl has an overwhelming amount of color in it now.  We put a fancy, brightly colored texture on the halfpipe that repeats to give the player a sense of motion.  Before, you couldn't really tell if you were moving since it was just a super long gray halfpipe with a fixed camera.  The balls rushing toward you helped, but the colors make it feel a bit more frantic and hip.  We also randomly generate colors for the balls.  Ideally these will turn into some sort of electrostatic particle effect, but that isn't really necessary for a whitebox prototype.  Lastly, we added lasers, which are really just long cylinders that you cannot pass through.  They are forewarned by a red ball about a half second before they show up, though with all the other multicolored balls it's a little hard to tell right now.

In addition to all that, Shirly implemented a lot of the musical features of the game.  We couldn't natively make Unity force audio out of the left or right speaker, so for now, the farther you get from the center of the halfpipe, the quieter the music gets.  Additionally, the speed at which you move affects the song's playback speed.  It's kind of a neat effect, but it's still a little wonky.  It definitely works better with some songs than others.
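A minimal sketch of those two effects, assuming illustrative names and mapping curves rather than Shirly's actual code: distance from the pipe's center drives volume, and the player's speed drives playback speed via the audio source's pitch.

```csharp
using UnityEngine;

// Sketch of the audio manipulation described above (field names and curves are illustrative).
public class MusicReactor : MonoBehaviour
{
    [SerializeField] private AudioSource music;
    [SerializeField] private Transform player;
    [SerializeField] private float halfpipeHalfWidth = 5f;   // distance from the center to the pipe's edge
    [SerializeField] private float baseSpeed = 10f;          // player speed that maps to normal playback

    private Rigidbody playerBody;

    private void Start()
    {
        playerBody = player.GetComponent<Rigidbody>();
    }

    private void Update()
    {
        // Quieter the farther the player is from the center of the pipe.
        float offCenter = Mathf.Abs(player.position.x) / halfpipeHalfWidth;
        music.volume = Mathf.Clamp01(1f - offCenter);

        // Faster or slower playback (and pitch) depending on the player's current speed.
        float speed = playerBody.velocity.magnitude;
        music.pitch = Mathf.Clamp(speed / baseSpeed, 0.5f, 2f);
    }
}
```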

Right now we're figuring out how to analyze the songs using Unity's libraries to actually generate the obstacles.  Also, we'd like to implement a rewinding function.  I have an idea about how to do it and will hopefully get that working on Thursday.

Here's a screenshot of all the colors.  This should hopefully clarify a bit of the details of what I'm talking about.


Oh, we also added some fun lasers in the background for a cool visual effect.  It was kind of an accident when we implemented the red lasers, but it looked cool enough that it's not a bug, it's a feature!

Sunday, February 10, 2013

Real Time Graphics Asg04

So this assignment ended up being way harder than I thought.  Here it is.  This time we had to update our content pipeline so that we could generate all the relevant graphics data from text files.  Now, with minor changes to some text, I can add dozens upon dozens of cubes of various colors and textures, with colored light and intensity.  It's pretty much the greatest graphics content pipeline ever, objectively speaking.

Seriously though, the assignment had very little to do with actual graphics stuff, but ended up being kind of interesting regardless.  It was a lot of string parsing, which is the devil, and took forever, but that stuff is pretty boring and not too challenging so I don't really feel the need to talk about it.  It did take me like ten hours for that chunk alone though.

The reason it was interesting was the remaining 5 hours, you know, those 5 hours that you expect to take about 1, but it never really works out that way.  Once I got all the parsing stuff in, figuring out how to modify the current renderer class that our teacher built to display multiple sets of vertex data was quite the ordeal.  I ended up wrapping most of the LoadScene, ReleaseScene and Update methods in loops that basically did the same thing for each cube I had to draw.  The whole process didn't seem quite right, but by this point in the project I was somewhat out of steam.  I'll probably look into rearranging some things and optimizing in later assignments.  In fact, I think that's going to be an upcoming requirement of a new assignment.  :)

I also spent a lot of time trying to figure out the best way to pass data up from the parsers and into the eventual DirectX calls.  The other Jason and I had a pretty big debate about some aspects of it, mainly because he wanted to copy all the vertex and index data into the buffers we were already using and not change the DirectX calls.  I figured we'd save space and time by just passing the buffers we populated during parsing directly.  After about 3 hours of trying to make it work my way, I gave up and did it his way.  I think I got really close, cause I found the biggest culprit of my errors was a typo in my mesh parsing code.  Still, at some point you've just got to cut your losses.  I had other stuff to work on and I did meet all the requirements.

Anyway, despite this assignment not having much to do with the high level graphics stuff we always discuss in class, it forced me to be way more familiar with what all those DirectX calls were actually doing since now a lot of them are done three times.  Not all though, turns out clearing the screen 3 times per Update call is a bad idea.

Here's my PIX depth screenshot by the way.  I think this part of the assignment took about 7 minutes, but it's definitely noticeable.
And here's my source code.

Tuesday, February 5, 2013

More Vinyl Changes and Progress

So I mentioned in the last Vinyl related post that we were making some substantial changes to the design.  It sounds like everyone in the cohort did because we're now getting an extra week to work on a prototype.

Anyway, the essence of the game remains the same, but to further emulate being in the groove of a record we decided to put the player in a halfpipe like Sonic 2's special stages.  This let us brainstorm a lot more record-like abilities and functionality.  We still want to affect the stereo sound based on which side of the halfpipe you are on, but we're running into technical difficulties there, in that Unity doesn't seem to be capable of doing it.  Additionally, we came up with ideas such as a player-usable "scratch" that rewinds time a bit and, of course, plays the "wiki wiki" sound.  We'd also like the pipe to curve and drop according to the beat.  I think the coolest, but possibly hardest, feature to implement is jumping to a different point in the song if the player gets too far up the side of the pipe and falls into the next track of the record.

What we've got so far is fairly simple: a player cube that can go up and down a halfpipe while randomly generated balls rush past it.  Here's a screenshot.
The hardest part was getting the cube to actually move along the curve of the halfpipe.  We tried a number of techniques, but ended up settling on a RotateAround method that currently pivots around the origin.  This isn't going to work once we start changing the pipe's direction, but it should suffice for now, and we shouldn't have to change it too much once our levels get more complex.
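A minimal sketch of that movement, assuming the pipe runs along the world Z axis and is centered on the origin (the class name and input mapping are illustrative):

```csharp
using UnityEngine;

// Sketch of halfpipe movement: rotate the player cube around the pipe's center axis.
// Assumes the pipe is straight, centered on the origin, and runs along world Z.
public class HalfpipeMover : MonoBehaviour
{
    [SerializeField] private float degreesPerSecond = 90f;

    private void Update()
    {
        float input = Input.GetAxis("Horizontal");   // move up and down the sides of the pipe

        // Pivot around the world origin along the pipe's forward axis,
        // which only works while the pipe doesn't change direction.
        transform.RotateAround(Vector3.zero, Vector3.forward, input * degreesPerSecond * Time.deltaTime);
    }
}
```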

Thursday, January 31, 2013

Real Time Graphics Asg03

Now that I actually understand the graphics pipeline a bit better, this assignment was quite a bit easier.  I understood immediately what the light class would look like and how everything was going to fit together.  It was just a matter of figuring out the right method calls and changing the mesh file into something more readable (you're welcome; I think string parsing in C++ actually physically hurts me).  ;)  Anyways, here's the assignment for the randoms reading this.  Basically this week we had to texture our cube and apply lighting.

The controls are the same as before, WASD controls the camera, the Arrow keys control the cube, and IJKL controls the light source.

The troubles I ran into this week were all kind of trivial.  First of all, I don't know if this was brought up in class, but the UV coordinates originate from the top left.  It seems like each individual aspect of the DirectX graphics pipeline uses a different origin.

The second issue was I assumed the D3DCOLOR member of the s_vertex was size 12 because it's three ints, but it's actually a single packed 32-bit DWORD (4 bytes).  That caused my texture to be drawn horribly wrong because I used the wrong offset into the struct.  That took a while to fix; really, Sherly figured it out.
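Here's the kind of thing I mean; a hypothetical layout for illustration (our actual s_vertex may order things differently), showing why the texture coordinate offset ends up at 16 rather than 24:

    #include <d3d9.h>

    struct s_vertex_example          // hypothetical layout, just to show the sizes
    {
        float x, y, z;               // position: 12 bytes (offset 0)
        D3DCOLOR color;              // packed ARGB: 4 bytes (offset 12), not 12
        float u, v;                  // texture coordinates: 8 bytes (offset 16)
    };

    // The offsets in the vertex declaration have to match the struct exactly,
    // or the texture coordinates get read from the wrong bytes.
    const D3DVERTEXELEMENT9 declaration[] =
    {
        { 0,  0, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
        { 0, 12, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0 },
        { 0, 16, D3DDECLTYPE_FLOAT2,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
        D3DDECL_END()
    };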

Another aggravating issue was that I used the word texture as a variable name in my fragment shader, which must be a keyword or something.  I should probably get a syntax highlighting plugin.  The error just said something like 'unexpected token texture'.  I changed it on a whim to texture_sample and it worked.

Finally, I couldn't tell if my light was actually moving at first because I was moving it at the same speed as the camera and the cube.  I'm guessing changing it by 1 has such a minuscule effect on the angle hitting the cube that you have to move it really far to notice.  I changed the controls to offset by 10 at a time and it works much better.

Here's my required screenshot.
And here's my code!

P.S. Jason and I worked together, so that's why we wrote about the same issues.


Tuesday, January 29, 2013

Vinyl Tech Stuff


So the only substantial thing that happened last week was technological stuff.  The cohort of students ahead of us in the program made a forced runner thesis game in Unity, and I asked them some questions about how they did it.  Turns out they used this pretty fantastic library called iTween that does a lot of animation-type stuff and even music manipulation.  I spent a while this week learning my way around the framework and got a cube spinning around a record.  It sounds like we'll be changing the design idea around a bit this week though, so I didn't go too much further.  iTween will work just as well for the new idea, however, and hopefully we can get started tomorrow or Thursday.

I'll discuss the details of the changes in a later post, along with maybe some screenshots of what we've got.

Thursday, January 24, 2013

Real Time Graphics Asg 02

Woo, check out my wonderful trippy as balls cube.  That took a whole lot of work.

You can control the camera with WASD and the cube with the arrow keys; it works just like the instructor's, and the cube spins constantly.  Most of the code ended up in cRenderer.  It's kinda gross in terms of organization at the moment, but until I really see what's going on and where we are going with this class, I'm gonna leave it as is.

So this assignment was pretty dang hard.  I understood all the concepts of the various view spaces discussed in class, and understood what we were supposed to implement, which I'll go over here to humor John Paul.

Basically we created a cube in model space, converted it to world space in the shader, converted that to view space and finally to projected space and sent that to the fragment shader.  It made plenty of sense and I've had a lot of 3D math experience, but actually writing the code was a whole other ordeal.  I relied very heavily on the video of the lecture since a lot of sample code could be gleaned from there.  I had the most trouble figuring out how various components communicated with each other though.

For instance, assembling the index buffer was easy: I drew a diagram, counted the vertices, and set it all up using the already-written vertex buffer code.  It's almost identical; you just have to change a lot of method calls to their index equivalents.  Making the two buffers play nice was another matter that actually ended up being as simple as changing the draw method, but it wasn't immediately obvious.
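For anyone curious, the index-buffer half ends up looking something like this; a rough sketch with made-up names, assuming 16-bit indices and a triangle list:

    #include <d3d9.h>
    #include <cstring>

    // Create and fill a 16-bit index buffer; it mirrors the vertex buffer code
    // almost line for line, just with the "Index" variants of the calls.
    IDirect3DIndexBuffer9* CreateCubeIndexBuffer( IDirect3DDevice9* i_device,
                                                  const WORD* i_indices,
                                                  unsigned int i_indexCount )
    {
        IDirect3DIndexBuffer9* buffer = NULL;
        const UINT sizeInBytes = i_indexCount * sizeof( WORD );
        if ( FAILED( i_device->CreateIndexBuffer( sizeInBytes, D3DUSAGE_WRITEONLY,
                                                  D3DFMT_INDEX16, D3DPOOL_DEFAULT,
                                                  &buffer, NULL ) ) )
            return NULL;
        void* destination = NULL;
        buffer->Lock( 0, sizeInBytes, &destination, 0 );
        memcpy( destination, i_indices, sizeInBytes );
        buffer->Unlock();
        return buffer;
    }

    // ...and the draw method change is going from DrawPrimitive to:
    // i_device->SetIndices( buffer );
    // i_device->DrawIndexedPrimitive( D3DPT_TRIANGLELIST, 0, 0, vertexCount, 0, triangleCount );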

The hardest part was figuring out how to make the camera communicate with everything.  Setting up a class was easy, but it ended up just sitting there until a classmate showed me how to integrate it with the world-to-view transform, which in retrospect makes total sense.  View to projected, on the other hand, just takes weird, unintuitive parameters.
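In case it helps anyone else, the two transforms boil down to a pair of D3DX helpers.  This is a sketch with made-up values, not our actual camera class, but it shows which "weird" parameters view-to-projected actually wants:

    #include <d3dx9.h>

    // A sketch of building the camera's two matrices each frame.
    void BuildCameraTransforms( D3DXMATRIX& o_worldToView, D3DXMATRIX& o_viewToProjected )
    {
        // World-to-view comes straight from where the camera is and what it looks at.
        const D3DXVECTOR3 eye( 0.0f, 0.0f, -10.0f );
        const D3DXVECTOR3 target( 0.0f, 0.0f, 0.0f );
        const D3DXVECTOR3 up( 0.0f, 1.0f, 0.0f );
        D3DXMatrixLookAtLH( &o_worldToView, &eye, &target, &up );

        // View-to-projected is where the unintuitive parameters live:
        // vertical field of view, aspect ratio, and the near/far clip planes.
        D3DXMatrixPerspectiveFovLH( &o_viewToProjected, D3DX_PI / 4.0f,
                                    800.0f / 600.0f, 0.1f, 100.0f );
    }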

Once everything was built, I still had a lot of debugging to do.  When you copy and paste code, things don't always get changed as they should.  The most frustrating part was seeing a perfect cube drawn in the PIX Mesh tab like so, but seeing nothing in my fun little window.
Turns out I had set up my camera's translate function to assign the offset (=) rather than add it (+=), so the camera was perpetually trapped inside the cube, where nothing was rendered.  I had to set a culling mode that a classmate showed me so that I could at least see that things were being drawn before realizing the camera translate bug.
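Two little snippets sum up the whole ordeal; the function and member names here are made up, but the render state call is the real one my classmate pointed me at:

    #include <d3dx9.h>

    // Hypothetical camera translate; the fix was basically one character.
    void CameraTranslate( D3DXVECTOR3& io_position, const D3DXVECTOR3& i_offset )
    {
        // io_position = i_offset;   // the bug: the camera never actually moves
        io_position += i_offset;     // the fix: accumulate the offset
    }

    // The debugging aid: disable back-face culling so *something* renders even
    // when the camera is stuck inside the cube looking at back faces.
    void DisableCulling( IDirect3DDevice9* i_device )
    {
        i_device->SetRenderState( D3DRS_CULLMODE, D3DCULL_NONE );
    }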

This was an awesome assignment though; I learned a ton about the graphics pipeline and I can't wait to try out this lighting business.  After all, that's why they're called shaders, I assume.  Here's the source code.

Monday, January 21, 2013

Vinyl is a go!

So the professors revealed last week that they won't be choosing our games, but we will.  There were a few stipulations, but basically the teams must be a minimum size of 5 and max of 12.  After that, pretty much anything goes.

I was incredibly impressed by the sheer number of students in our program pitching and the overall quality of most of the pitches.  As you can gather from the title, my game was one of the few chosen to be made, but during the pitches I kept thinking I'd be happy to work on almost any of the games pitched.  Nothing is official until tomorrow, but it looks like the three games below will be prototyped over the next few weeks.

Cellblock is an asynchronous game about a hacker trapped in a prison cell being rescued by a soldier.  The soldier player will play through an FPS-style game, working his way toward the hacker, while the hacker helps him fight through the prison remotely by disabling security cameras and whatnot.

The next game has no name yet, but focuses on physics-based combat and/or platforming.  I helped work on a prototype for the basis of this game last semester and I'm really excited it got picked.  Cody, the lead on the project, further refined the design and pitched a very compelling idea that could become a fun combat game or perhaps a platformer, or maybe even some combination of the two.

Vinyl was my game, which I described last week.  I can't wait to get started on the prototype; I've always wanted to make a music game and this will be a great opportunity with a very talented group of engineers.  The current plan is to prototype in Unity and then decide which engine we want to really build it with sometime in February.  Unity would be ideal for deploying to multiple platforms, but I'm not a huge fan of the lack of flexibility the engine offers.  It's great for getting something up and running fast, but it seems pretty terrible for getting anything very specific working at all.  We will see, I suppose.

Thursday, January 17, 2013

Real Time Graphics Asg01

So I found this first graphics assignment very interesting, seeing as I'm completely new to shaders.  I ran into several interesting problems, so let's just start from the beginning.  For random people who stumble upon this blog, here's the assignment, assuming it stays up a while.  For the TL;DR crowd, we basically had to take our teacher's code base that draws one triangle and runs it through a vertex and fragment shader, and change it to draw a rectangle and modify the shaders.

First, after looking through the source code our teacher provided, I figured out how the various asset builders worked.  Since the MeshBuilder was explicitly stated to just copy a file to a new location, I took the TextureBuilder code and pasted it into a MeshBuilder.  And lo and behold, I got a compiler error.  Those are the best kind of errors, aren't they?  The CopyFile method call was complaining that it couldn't take a character array, and after a lot of tinkering I found the option to change the character set for a Visual Studio project.
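For the curious, the issue is that the CopyFile macro expands to the wide-character CopyFileW when the project's character set is Unicode, so plain char arrays get rejected.  A simplified sketch of the builder's copy step (not the actual class), showing the two ways around it:

    #include <windows.h>

    // Simplified MeshBuilder "build": just copy the source mesh file to the target path.
    bool CopyMeshFile( const char* i_sourcePath, const char* i_targetPath )
    {
        // CopyFileA takes plain char arrays; alternatively, the bare CopyFile macro
        // works once the project's character set is switched to Multi-Byte.
        return CopyFileA( i_sourcePath, i_targetPath, FALSE ) != FALSE;
    }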

The next fun problem was that the MeshBuilder was taking the file with the vertex coordinates and copying it to a path relative to the application built in the Debug folder, so running the code from within Visual Studio, instead of just clicking on the application, didn't work.  To remedy this I put a dummy folder and vertex file in the same folder as the project solution.  It never actually gets read, but it gets rid of some build errors.

The way I formatted my vertex input file was pretty much the laziest way possible.  I had grand plans, including a triangle count and a triangle label before each set of three vertices.  Upon remembering how terrible string parsing is in C++, I quickly settled on vertices delimited by newlines and every triangle spaced by an extra blank line.  This will surely change in the future, probably next week, but it's a good start.
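Just so the laziness is on record, a rectangle's worth of the file looks something like this (made-up coordinates; one vertex per line, triangles separated by a blank line):

    -1.0 1.0 0.0
    1.0 1.0 0.0
    -1.0 -1.0 0.0

    1.0 1.0 0.0
    1.0 -1.0 0.0
    -1.0 -1.0 0.0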

The rest of the fun problems mainly consisted of learning DirectX rules, such as winding order and finding where our teacher hard-coded the draw primitive count to 1.  It initially also appeared that I had to draw my triangles starting from the top-left-most vertex, but I eventually figured out that clockwise winding was the only requirement, and even that can be reversed with a setting somewhere.
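Concretely, the two lines that bit me were roughly these (the render-state and primitive-type values are the real D3D9 ones; the function and variable names are mine):

    #include <d3d9.h>

    void DrawRectangle( IDirect3DDevice9* i_device )
    {
        // D3D9 culls counter-clockwise faces by default, which is why triangles
        // must be wound clockwise; D3DCULL_CW would reverse that requirement.
        i_device->SetRenderState( D3DRS_CULLMODE, D3DCULL_CCW );

        // Two triangles for the rectangle, so the hard-coded 1 had to become 2.
        i_device->DrawPrimitive( D3DPT_TRIANGLELIST, 0, 2 );
    }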

When I tried adjusting the shaders, I had trouble figuring out which variations did what.  I tried adjusting the fragment shader, which simply passes the color through from the vertex shader.  I halved the color first and then set it equal to the output color, but did not see any change to the rectangle.  In the vertex shader I changed the color and position based on elapsed time, similar to how the teacher did it, but I passed the time into tangent functions so the square would teleport from one corner of the screen to the other at the end of each pass.  Also, I got a fun little color splash that happened twice during the slower part of the tangent curve.  I tried adjusting the alpha values as a function of the tangent of time as well, but found out alpha blending is turned off by default.  You can see the values changing in the PIX debugger, but they had no effect.

Finally, we had to take a few screenshots showing us debugging in PIX, so here they are.  For some reason, when PIX ran my executable it only drew one of the triangles.  It read the second one, but set its vertices to 0.  I never did figure out why.  UPDATE: I figured out it was because of a buffer overflow problem that I've since fixed.  I didn't retake the screenshots though, so if you can make out those 0,0 vertices, that's why.



And here's my source code.

Sunday, January 13, 2013

Semester 2 Start!

So I'm back after what was the least relaxing break I think I've ever had.  I spent most of it at the new job or working on our Ubisoft prototype for this competition.  I guess it was worth it though; ours was picked, so we get to compete in Montreal come April.

Anyway, this post is going to be more about ideas than whining about my lack of break.  This semester starts off with game pitch ideas.  Everyone has to pitch, or at least be involved in a pitch.  The top several get selected to be rapidly prototyped, and the top two or three of those become our thesis projects for the rest of our program.

There was a lot of talk about synesthesia, or however you spell it, and how synesthesia games are potentially good for a team with a limited number of artists, which is what we have.  That resonated with me; I've always wanted to make a music game kind of like Rez, so I got to thinking about potential ideas for that.

Not much came to mind at first, but as always, after giving up and trying to sleep a couple of days later, I thought of taking the grinding mechanic out of Sonic Adventure 2 and turning the rails into strings.  It seemed like a fun idea to jump between strings to dodge obstacles and simultaneously create musical notes.  The game would be almost like a forced runner and, if played properly, would result in a cool song.  My partner (Mike) and I liked the idea, but we knew it needed more, so we kept thinking about it.

After a bit more thought, Mike came up with the idea of replacing the strings with a vinyl record and grinding the ridge like the needle of a record player.  This led to a myriad of new ideas and even allowed us to incorporate the old ideas into a sort of "star power" metagame.  This solved the problem of being a gimmick, and it lends itself nicely to a lot more musical tricks, such as balancing on the rail affecting the stereo sound.

I'll post some excerpts from the full design doc once it's ready and comment on the class's reception.