Here’s how the coordinate systems originally work.
- An artist draws game art with up as the default orientation.
- The physics engine has the X axis pointing right and the Y axis pointing up.
- The Flash coordinate system has the X axis pointing right and the Y axis pointing down, because it was designed mainly for websites.
- Flash has no notion of a “camera” or view object that follows another object, so camera tracking means moving the whole stage. The entire world moves when the view moves, instead of just one camera, which means recalculating every object’s position whenever the viewpoint changes. This is very inefficient and intensive, and it is the primary reason Flash is so slow in games with moving viewpoints.
So the solution is to disconnect the physical world from the view world. Kind of like a recent physicist’s theory that we are just a holographic projection of a 2D arrangement somewhere else.
I keep track of the stage (i.e., camera) offset, calculate where it falls in the physics engine, and draw whatever is there onto the stage every frame. This saves me the headache of recalculating every object’s position and moving them all. With some determination I now have an overhauled system where I no longer have to do the four awkward coordinate translations I used to: cos θ is the X projection and sin θ is the Y projection everywhere… except I now have to rotate the artist’s drawings so that up becomes right. There is no digital solution to physical-world problems. This also removes the threshold cases that come with tan and arctan, by limiting the angle math to just cos and sin, which are simple, continuous waveforms on their own and should eliminate the weird AI behaviors caused by targets sitting at those threshold angles.
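A minimal sketch of the idea, written in TypeScript rather than the project’s ActionScript 3; the names (`Camera`, `worldToStage`, `velocityFromHeading`) are my own, not from the game’s code:

```typescript
// World-to-stage transform: the physics world uses Y-up, the stage uses Y-down.
// Objects keep their physics coordinates; only the camera offset changes,
// and each frame we draw whatever falls under the camera onto the stage.

interface Vec2 { x: number; y: number; }

class Camera {
  constructor(public offset: Vec2, public stageHeight: number) {}

  // Convert a physics-world position into stage (screen) coordinates.
  worldToStage(p: Vec2): Vec2 {
    return {
      x: p.x - this.offset.x,
      // Flip the Y axis: physics Y-up becomes stage Y-down.
      y: this.stageHeight - (p.y - this.offset.y),
    };
  }
}

// Heading projection using only cos/sin -- no tan/atan threshold cases.
function velocityFromHeading(angle: number, speed: number): Vec2 {
  return { x: Math.cos(angle) * speed, y: Math.sin(angle) * speed };
}

const cam = new Camera({ x: 100, y: 50 }, 600);
const onScreen = cam.worldToStage({ x: 140, y: 80 });
// onScreen is { x: 40, y: 570 }
```

When the camera moves, only `cam.offset` changes; nothing in the physics world is touched, which is the whole point of disconnecting the two.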
I have also finally sat down and read through this interesting article on what space combat in the real world would look like: spacecombat. The current system already makes combat feel like it is heading that way, so I am going to keep this in mind as I flesh out the details of weapons. However, I am going to leave things open so that users will eventually be able to design and implement their own weapon systems. A game should free the user’s creativity by eliminating the question of how a process will be done and just allowing it to be, while applying limitations based on physics. That is the most important lesson I learned from Portal.
- Implement AI
- Collapse angle, direction, position, destination, x, y and velocity into stats Array
- Major problem: asset loading from web XML triggers a security warning from Adobe. Need to build the assets into the Flash file instead.
- Add offset for weapons position in ship
- The AI’s calculation of where to turn is a bit off when the player is in the 2nd quadrant (and directly on the negative Y axis) relative to the AI.
- GUI health bar does not decrease fully/correctly; also, the health count drops below zero before counting is terminated.
- Determine which class should retain the weapon ID
- Actor? Ship? Player? AI?
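One task above is collapsing angle, direction, position, destination, x, y and velocity into a single stats Array. A sketch of what that might look like, again in TypeScript standing in for AS3; the index names and layout are my own guesses, not the project’s actual scheme:

```typescript
// Flat stats array indexed by named constants, replacing separate fields.
// Layout is hypothetical -- only the idea of "one array per actor" is from the post.
const ANGLE = 0, DIR = 1, POS_X = 2, POS_Y = 3,
      DEST_X = 4, DEST_Y = 5, VEL_X = 6, VEL_Y = 7;

function makeStats(): number[] {
  return new Array(8).fill(0);
}

const stats = makeStats();
stats[POS_X] = 140;
// Velocity from heading, using only cos/sin as described above.
stats[VEL_X] = Math.cos(stats[ANGLE]); // X projection
stats[VEL_Y] = Math.sin(stats[ANGLE]); // Y projection
```

One array per actor keeps all motion state in a single object that is cheap to pass around and copy, at the cost of readability compared to named fields.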
- Core: Physics based space shooter
- GUI: Health, ammo, radar, inventory
- Graphics: B/W vector graphics
- Music: 1 track repeated
- Artwork for one player ship and one AI ship finished, along with their XML data (Art + some programming)
- Hitpoints, body existence timer and actor destruction (Hardcore Programming)
- One planet orbiting the sun. (Hardcore Programming)
- GUI placement (Art + menial Programming):
Health bar, ammo count, radar, inventory
- Booster animation (Art + Math + some programming)
- Populated galaxy (menial programming)
- Dynamic graphics loader for background? (Cleverness)
Features for next release
- Genetic AI
- Secondary gravity fields for planets, large objects and black holes
- Multiple galaxies
Features for future release
- Base building
- Market system