NavigaSU presentation

Posted in Uncategorized on June 15, 2011 by yamanterzioglu

finalsunumu

NavigaSU footage.

Posted in Uncategorized on June 15, 2011 by yamanterzioglu

NavigaSU Booklet.

Posted in Uncategorized on June 15, 2011 by yamanterzioglu

 

Final documentation

links to demos.

Posted in Uncategorized on June 6, 2011 by yamanterzioglu

These were captured from my computer's screen while I was developing, so expect low resolution. I'll post another demo later featuring an actual iPhone with the app running on it.

NavigaSU iPhone app.

Posted in Uncategorized on June 6, 2011 by yamanterzioglu

Hi, I'm going to talk about my project NavigaSU, which I mentioned earlier, and go over the steps I took to finish it. First, an overview: my project is a 3D iPhone app for the Sabancı University campus, built in Unity 3D for navigation and info purposes. My first goal was to create a pathfinding system where the app would compute the shortest path between the user's location and the targeted destination. After weeks of different approaches and attempts, it proved quite difficult for me to implement the library created by 4Ant. It actually works perfectly, but I had to write a seeker script that would make the program find a path between a specifically selected building and the user. With that problem blocking my way forward, we decided to drop that feature of the project and focused on the information and general map features.
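
For reference, the sketch below shows roughly what that seeker script needed to do. It is only a sketch: FindShortestPath is a self-contained placeholder standing in for whatever call the pathfinding library actually exposes, and the object names are assumptions, not the real scene objects.

var selectedBuilding : Transform;   // set when the user picks a destination building
var userMarker : Transform;         // object tracking the user's current location

function RequestPath () {
    if (selectedBuilding == null) { return; }
    var waypoints : Vector3[] = FindShortestPath(userMarker.position, selectedBuilding.position);
    // draw the path so it is visible in the scene view for a few seconds
    for (var i = 0; i < waypoints.Length - 1; i++) {
        Debug.DrawLine(waypoints[i], waypoints[i + 1], Color.green, 5.0);
    }
}

// Placeholder so the sketch is self-contained: a real pathfinding library
// would search a walkable graph here instead of returning a straight line.
function FindShortestPath (startPos : Vector3, endPos : Vector3) : Vector3[] {
    var line = new Vector3[2];
    line[0] = startPos;
    line[1] = endPos;
    return line;
}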

With the info feature becoming the main focus, I thought of creating unique environments for each building, where users can access information about that building and its sub-branches. Here the users are guided with buttons (unlike the map scene's touch-to-control feature), watching pre-defined animations of objects and text within that environment, which together form the realm. The camera moves as the buttons are pressed, leading the user into the newly revealed environment specific to that building's sub-branch. I needed a way to create these pre-defined animations, and after days of research and trials with different libraries, I decided to use iTween, which is pretty amazing.
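
To give an idea of what that looks like in practice, here is a minimal sketch of the kind of iTween call that drives the camera; the infoAnchor object and the timing are placeholders I made up for the example, not the actual scene setup.

// Called from a building's GUI button: fly the camera to that building's
// info environment. infoAnchor is an empty object marking the camera's target.
var infoAnchor : Transform;
var flyTime = 2.0;

function GoToInfoEnvironment () {
    iTween.MoveTo(Camera.main.gameObject,
        iTween.Hash("position", infoAnchor.position, "time", flyTime));
    iTween.RotateTo(Camera.main.gameObject,
        iTween.Hash("rotation", infoAnchor.eulerAngles, "time", flyTime));
}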

Basically, the app consists of three main screens that work in different ways.

There is the map screen, where the user interacts with the map via touch, changing the colors of buildings, seeing their names, and so on. This is for easier interaction with the map and for those who don't need to access the info environments. The second screen (scene) is the info scene, which lets users jump to a particular building's information without the map. In this scene the user sees 2D schematics of the important buildings, and touching one makes the app switch to that building's info environment. The last one is the map & info scene, where both features are combined: users can interact with the map the same way they do in the map scene, and touching a building also takes them to that specific building's environment.
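
The touch interaction on the map boils down to raycasting from the camera into the 3D model. The sketch below shows the general idea; the highlight color and ray distance are arbitrary example values.

// Map scene sketch: when a touch begins, cast a ray from the camera into
// the campus model; if it hits a building, tint it to show the selection.
function Update () {
    if (Input.touchCount == 1) {
        var touch : Touch = Input.touches[0];
        if (touch.phase == TouchPhase.Began) {
            var ray : Ray = Camera.main.ScreenPointToRay(touch.position);
            var hit : RaycastHit;
            if (Physics.Raycast(ray, hit, 1000.0)) {
                hit.collider.renderer.material.color = Color.yellow;
            }
        }
    }
}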

While doing all that, I also looked into other libraries that would let me implement an AR system for locating individuals. It would work like this: the user takes a photo of a marker on a building's wall that is specific to that building, and that marker carries the building's location on the app's map, pinpointing the user's current whereabouts. This was another project in itself, so there was no time left for me to develop it, but I know the basic logic behind doing such a thing, so why not do it?
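
Since that feature only exists as an idea, here is just a sketch of the final step: once some AR library has recognized a marker, the app would map the marker's ID to a position on the campus model and move the user indicator there. The marker IDs and coordinates below are made-up examples.

var userMarker : Transform;   // indicator showing the user's position on the map

// Would be called by the AR library once it identifies which marker was photographed.
function OnMarkerRecognized (markerId : String) {
    if (markerId == "FASS_entrance") {
        userMarker.position = Vector3(120, 0, 45);
    }
    else if (markerId == "FENS_entrance") {
        userMarker.position = Vector3(80, 0, 210);
    }
}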

One of the most important aspects of this project was the openness of the Unity development environment. I would really love to improve what I started, and I will in the upcoming months: interiors may be introduced, a communication/bulletin board would be nice, and the AR function also looks appealing.

 

So that's pretty much it; I hope you enjoy it. It's not available on the App Store yet, but after some retouches it will be ready for submission.

I would also like to thank Ekmel Ertan, my instructor, Servet Ulaş, my assistant, and my classmates who were there for me.

thanks.

Button Fix!

Posted in Uncategorized on January 19, 2011 by yamanterzioglu

One day left before the jury, and I solved the touch/button issue. With a little help from our TA Servet Ulaş I was able to edit the script correctly and get the touch input working. The buttons now change the scene on touch input.

Here's the code:

var shot : Transform;          // not used in this script
var recoveryTime = 10;         // frames to ignore further touches after a scene change

private var delay = 0;

function Update () {
    print("test");                          // debug: confirms Update is running
    if (delay > 0) { delay -= 1; }

    if (delay == 0) {
        if (Input.touchCount == 1) {
            var currentTouch : Touch = Input.touches[0];

            // load the next scene when the touch begins on this button's GUITexture
            if (currentTouch.phase == TouchPhase.Began && guiTexture.HitTest(currentTouch.position)) {
                Application.LoadLevel(1);
                delay = recoveryTime;
            }
            print(currentTouch.phase);      // debug: shows the touch phase
        }
    }
}

project dev journal 05:

Posted in Uncategorized on January 19, 2011 by yamanterzioglu

I got approved by Apple for the developer license! I've already installed Xcode, the official toolkit provided by Apple, along with the SDK. Good thing I upgraded to Snow Leopard a couple of weeks ago 8) I built the app with its logo, splash screen and so on, did some fine tuning and debugging, and finally got Xcode to open the project and simulate/test it on my iPhone. Based on the output from that test I changed the location of my buttons. Oh, and by the way, I solved the resolution problem of the textures; it's not worth describing because it was so easy to fix 8)

I've completed the prototype and did what was asked of me, but I still want to see the buttons work with touch input, so I'll keep working on them and on the camera plugin. So far it looks good and I'm encouraged by the process and the result. I wasn't sure I could pull off the prototype assignment, but constant research and learning helped me a lot. What's left is the presentation for the final jury.