Search Results

Search found 8252 results on 331 pages for 'live mesh'.

  • How can I create a partition without using a Live CD or USB?

    - by Ariel
    Is it possible to create a partition while the system is running? When I try to do it in GParted, the options appear to be disabled because the disk is mounted, and it cannot be unmounted because the system is running from it. I would like to create a new partition without removing or affecting the existing file system; just create a new partition, but without the need to use a Live CD or USB.

    Read the article

  • Dynamic obstacles avoidance in navigation mesh system

    - by Variable
    I've built my pathfinding system with Unreal Engine. The pathfinding part works just fine, but I can't find a proper way to solve the dynamic obstacle avoidance problem. My characters walk all over the map and collide with each other while they move. I try steering them when a collision occurs, but this doesn't work well. For example, two characters block the road while the third one's path runs right between them, and he gets stuck. Can someone tell me the most popular way of doing dynamic avoidance? Thanks a lot.
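    The usual answers are velocity-obstacle methods such as RVO/ORCA (newer versions of Unreal expose crowd avoidance options along these lines), or, more simply, a separation steering force layered on top of path following. Below is a minimal, engine-agnostic sketch of the separation idea; the Agent type, its fields and the neighborRadius parameter are illustrative assumptions, not part of any engine API.

```csharp
// Minimal separation-steering sketch (an illustrative assumption, not engine code).
// Each tick, add the returned push vector to the agent's path-following velocity.
using System.Collections.Generic;
using System.Numerics;

class Agent
{
    public Vector2 Position;
    public Vector2 Velocity;
}

static class LocalAvoidance
{
    public static Vector2 Separation(Agent self, List<Agent> others, float neighborRadius)
    {
        Vector2 push = Vector2.Zero;
        foreach (var other in others)
        {
            if (ReferenceEquals(other, self)) continue;

            Vector2 away = self.Position - other.Position;
            float dist = away.Length();
            if (dist > 0f && dist < neighborRadius)
            {
                // Closer neighbours push harder; the weight falls off linearly with distance.
                push += Vector2.Normalize(away) * ((neighborRadius - dist) / neighborRadius);
            }
        }
        return push;
    }
}
```

    Pure separation still jitters in tight spots (the "two blockers" case above), which is why RVO/ORCA-style libraries that reason about relative velocities are the usual production answer.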

    Read the article

  • How do I change the keyboard layout to a non-standard one on a Live (USB) session?

    - by Agmenor
    I am running Ubuntu 13.04 in a Live (USB) session. My physical keyboard layout is called Bépo; it is the French Dvorak-inspired layout. I would like to change my input layout to this too. To do this, I tried booting into a French-language session and then opening the Keyboard Layout preferences app. Normally, to add a layout, you click the + sign and select your layout. However, the list that appears is very short and does not contain what I want. By contrast, on a persistent non-live installation the choice of Bépo is present, and it is also offered during an installation of Ubuntu. So how do I change the keyboard layout of my live session to the correct one?

    Read the article

  • Windows Live Messenger for Mac?

    - by studiohack23
    I have a friend who uses a Mac, and I was wondering: is there a version of Windows Live Messenger for Mac, or something comparable that takes advantage of the Windows Live ID? I'm interested in recommendations as well as a straight answer to "is there a Mac version of Live Messenger?" Thanks!

    Read the article

  • Mesh Remote Desktop crashes 64 bit Windows 7 VM (8 GB)

    - by Andrew J. Brehm
    I have a Windows 7 VirtualBox VM (64 bit, 8 GB) on a Snow Leopard host (64 bit, 24 GB). It works fine until I connect via Microsoft Mesh. When I connect via Mesh remote desktop, the VM crashes about one or two minutes after the connection has been established. It doesn't respond to pings (from the host or from other machines on the network), and no RDC connections (from other Windows machines on the network where Mesh works) are possible. Any ideas?

    Read the article

  • SnagIt Live Writer Plug-in Updated

    - by Rick Strahl
    Ah, I love SnagIt from TechSmith and I use the heck out of it almost every day, so it is no surprise that some time ago I decided to integrate SnagIt into a few applications that rely heavily on screen shots. It's been a while since I've posted an update to my small SnagIt Windows Live Writer plug-in. A few nagging issues have crept up with changes in the way recent versions of SnagIt handle captures, and they have been addressed in this update of the plug-in. Personally I love SnagIt and use it extensively, mostly for blogging, but also for writing documentation and articles. While there are many other (and also free) tools out there to do basic screen captures, SnagIt continues to be the most convenient tool for me, with its nice built-in capture and effects editor that makes creating professional-looking captures childishly simple. And maybe even more importantly: SnagIt has a COM interface that can be automated, which makes it super easy to embed into other applications. I've built plugins for SnagIt as well as for one of my company's own tools, Html Help Builder. If you use the Windows Live Writer offline WebLog editor to write blog posts and have a copy of SnagIt, it's probably worth your while to check this out if you haven't already. This plugin integrates SnagIt with Live Writer so you can easily capture and edit content and embed it into a post. Captures are shown in the SnagIt Preview editor, where you can edit the image and apply markup or effects before selecting Finish (or Cancel). The final image can then be pasted directly into your Live Writer post. When installed, the SnagIt plug-in shows up in the Plug-in list or in the Plug-ins toolbar shortcut. Once you select the plug-in you get the capture window that lets you customize the capture process, which includes most of the useful SnagIt capture options. Once you're done capturing, the image shows up in the SnagIt Image Editor where you can crop, mark up and apply effects. When done you click the Finish button and the image is embedded right into your blog post. Easy - how do you think the images in this blog entry got in here? The beauty of SnagIt is that it's all easily integrated - capturing, editing and embedding take only a few seconds, especially if you save image effect presets in SnagIt.

    What's updated: The main issue addressed in this update has to do with how the plug-in restores the Live Writer window. When a capture starts, Live Writer gets minimized to get out of the way and let you pick your capture source; when the capture is complete and the image has been embedded, Live Writer is activated once again. Recent versions of SnagIt, however, changed SnagIt's window positioning so that Live Writer ended up popping back up behind the SnagIt window, which was pretty annoying. This update pushes Live Writer back to the top of the window stack using some delaying tactics in the code. There have also been a few small changes to the way the code interacts with the COM object, which is now more reliable if a capture fails, or if SnagIt blows up or is locked because it's already in a capture outside of the automation interface.

    Source Code: SnagIt automation is something I actually use a lot. As mentioned, I've integrated this automation into Live Writer as well as my documentation tool Html Help Builder, which I use just about daily. The SnagIt integration has a similar interface in that application and provides similar functionality.
    Because it's quite useful to embed SnagIt into other apps, there's source code that you can download and use in your own applications. The code includes both the dialog class that is automated from Live Writer and the basic capture component that captures images to a disk file.

    Resources: Download the SnagIt Capture Plug-in Installer - an MSI installer that installs the plug-in into Live Writer's PlugIns directory. Source Code to the SnagIt Capture Plug-in - contains the plug-in assembly, as well as the source code to the plug-in and the setup project.

    © Rick Strahl, West Wind Technologies, 2005-2011. Posted in Live Writer, WebLog.

    Read the article

  • HTML5 and CSS3 Editing in Windows Live Writer

    - by Rick Strahl
    Windows Live Writer is a wonderful tool for editing blog posts and getting them posted to your blog. What makes it nice is that it has a small set of useful features, plus a simple plug-in model that has spawned many useful add-ins. A small tool with a reasonably decent plug-in model to extend equals a great solution to a simple problem. If you're running Windows, have a blog and aren't using Live Writer, you're probably doing it wrong... One of Live Writer's nice features is that it can download your blog's CSS for preview and edit displays. It lets you edit your content in the context of that CSS using the WYSIWYG editor, so your content actually looks very close to what you'll see on your blog while you're editing your post. Unfortunately, Live Writer renders the HTML content in the Web Browser control's default IE 7 rendering mode. Yeah, you read that right: IE 7 is the default for the Web Browser control, and most applications that use it are stuck in this mode unless the application explicitly overrides the default. The Web Browser control does not use the version of Internet Explorer installed on the system (IE 10 on my Win8 machine) but uses IE 7 mode for 'compatibility' with old applications. If you are importing your blog's CSS, that may suck if you're using rich HTML 5 and CSS 3 formatting.

    Hack the registry to get Live Writer to render using IE 9 or 10: In order to get Live Writer (or any other application that uses the Web Browser control, for that matter) to render properly, you can apply a registry hack that overrides the Web Browser control engine used for a specific application. I wrote about this in detail in a previous blog post a couple of years back. Here's how you can set up Windows Live Writer to render your CSS 3 by making a change in your registry. The screen shot above is for setup on a 64 bit machine, where I configure Live Writer (a 32 bit application) to use IE 10 rendering. The keys set are as follows:

    32 bit configuration on a 64 bit machine: HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION - Key: WindowsLiveWriter.exe, Value: 9000 or 10000 (IE 9 or 10 respectively, DWORD value)

    On a 32 bit only machine: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION - Key: WindowsLiveWriter.exe, Value: 9000 or 10000 (IE 9 or 10 respectively, DWORD value)

    Use decimal values of 9000, 10000 or 11000 to specify specific versions of Internet Explorer. This is a minor tweak, but it's nice to actually see my blog posts now with the proper CSS formatting intact. Notice the rounded borders and shadow on the code blocks, as well as the overflow-x and scrollbars that show up. In this particular case I can see what the code blocks actually look like at a specific resolution - much better than in the old plain view, which just chopped things off at the end of the window frame. There are a few other elements that now show properly in the editor as well, including block quotes and note boxes that I occasionally use. It's minor stuff, but it makes the editing experience better and closer to the final result, so there are fewer republish operations than I previously had. Sweet! Note that this approach of putting an IE version override into the registry works with most applications that use the Web Browser control.
    If you are using the Web Browser control in your own applications, it's a good idea to switch the browser to a more recent version so you can take advantage of HTML 5 and CSS 3 in your browser-displayed content, by setting this flag in the registry automatically as part of the application's startup routine if no dedicated setup tool is used. At the very least you might set it to 9000 (IE 9), which supports most of the basic CSS 3 features and is a decent baseline that works for most Windows 7 and 8 machines. On machines running pre-IE9 browsers, rendering will fall back to IE 7 and look bad, but at least more recent browsers will see an improved experience. I'm surprised that there aren't more vendors and third party apps using this feature. You can see in my first screen shot that there are only very few entries in the registry key group on my machine - any other apps that use the Web Browser control are using IE 7. Go figure. Certainly Windows Live Writer should be writing this key into the registry automatically as part of installation to support this functionality out of the box, but since it does not, this registry hack lets you get your way anyway...

    Resources: .reg files to register Live Writer browser emulation (set for IE 9); Specifying Internet Explorer Version for Applications; SnagIt LiveWriter Plug-in; Download Windows Live Writer; Download Windows Live Writer with Chocolatey.

    © Rick Strahl, West Wind Technologies, 2005-2013. Posted in Live Writer, Windows.
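    If you want to apply the same override from code rather than by hand, here is a minimal sketch, assuming a .NET process running with administrative rights; the class and method names are made up for the example, but the key path, value name and decimal DWORD data are the ones quoted in the post (use the non-Wow6432Node path on a 32 bit only machine).

```csharp
// Hedged sketch: write the FEATURE_BROWSER_EMULATION override for Live Writer.
// Requires elevation because it writes under HKEY_LOCAL_MACHINE.
using Microsoft.Win32;

static class BrowserEmulation
{
    public static void EnableIe10ForLiveWriter()
    {
        const string keyPath =
            @"HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer" +
            @"\Main\FeatureControl\FEATURE_BROWSER_EMULATION";

        // 9000 = IE 9, 10000 = IE 10, 11000 = IE 11 (decimal DWORD values).
        Registry.SetValue(keyPath, "WindowsLiveWriter.exe", 10000, RegistryValueKind.DWord);
    }
}
```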

    Read the article

  • Watch 53rd Grammy Awards 2011 Live Stream & Follow The Event On Facebook & Twitter

    - by Gopinath
    The Grammy Awards is the biggest music awards ceremony honouring achievements in the music industry. The 53rd Grammy Awards ceremony will be held on February 13, 2011 at the Staples Center in Los Angeles. The star-studded musical event will be telecast live on CBS TV between 8 pm ET and 11:30 pm ET.

    Behind-the-scenes live stream on YouTube and Grammy.com: The Grammys, YouTube and TurboTax have teamed up to provide a live stream of behind-the-scenes coverage of key events, starting at 5 pm on February 11th and running through February 13th. Note that this stream will not broadcast the 3-hour award presentation itself; apart from the award presentation, the rest of the ceremony is streamed here. Before the show, you can watch the Nominees Reception, the Clive Davis Pre-Grammy Gala, the red carpet and the pre-telecast ceremony; during the show, go backstage to see the winners after they accept their awards. You can catch the same stream on the Grammys' YouTube channel as well as on the Grammy.com website.

    Follow the Grammy Awards on Facebook and Twitter: The Grammy Awards has official Facebook and Twitter accounts, and you can follow them for all the latest information: The Grammys' Facebook Page and The Grammys' Twitter Page.

    Grammy live streaming on UStream and Justin.tv: Even though there is no official source streaming the Grammy Awards presentation ceremony live, there will be plenty of unofficial sources on UStream and Justin.tv, so search those sites for live streams of the Grammys.

    TV channels telecasting the Grammy Awards 2011 (we will keep updating this section as and when we get more information):
    USA - CBS Television Network - February 13, 2011
    India - VH1 TV - February 14, 2011, 6:30 AM
    United Kingdom - ITV2 - February 16, 2011, 10:00 PM - 12:00 AM

    Read the article

  • How can I convert a 2D bitmap (Used for terrain) to a 2D polygon mesh for collision?

    - by Megadanxzero
    So I'm making an artillery-type game, sort of similar to Worms, with all the usual stuff like destructible terrain etc., and while I could use per-pixel collision, that doesn't give me collision normals or anything like that. Converting it all to a mesh would also mean I could use an existing physics library, which would be better than anything I can make by myself. I've seen people mention doing this by using Marching Squares to get contours in the bitmap, but I can't find anything which mentions how to turn these into a mesh (unless it refers to a 3D mesh with contour lines defining different heights, which is NOT what I want). At the moment I can get a basic Marching Squares contour which looks something like this (where the grid-like lines in the background would be the Marching Squares 'cells'): That needs to be interpolated to get a smoother, more accurate result, but that's the general idea. I had a couple of ideas for how to turn this into a mesh, but many of them wouldn't work in certain cases, and the one which I thought would work perfectly has turned out to be very slow and I've not even finished it yet! Ideally I'd like whatever I end up using to be fast enough to do every frame for cases such as rapidly-firing weapons or digging tools. I'm thinking there must be some kind of existing algorithm/technique for turning something like this into a mesh, but I can't seem to find anything. I've looked at some things like Delaunay Triangulation, but as far as I can tell that won't correctly handle concave shapes like the above example, and also wouldn't account for holes within the terrain. I'll go through the technique I came up with for comparison and I guess I'll see if anyone has a better idea. First of all, interpolate the Marching Squares contour lines, creating vertices from the line ends, and getting vertices where lines cross cell edges (important). Then, for each cell containing vertices, create polygons by using 2 vertices and a cell corner as the 3rd vertex (probably the closest corner). Do this for each cell and I think you should have a mesh which accurately represents the original bitmap (though there will only be polygons at the edges of the bitmap, and large filled-in areas in between will be empty). The only problem with this is that it involves looping through every pixel once for the initial Marching Squares, then looping through every cell ((image height + 1) x (image width + 1)) at least twice, which ends up being really slow for any decently sized image...
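    For the triangulation step this question circles around, the classic technique for concave simple polygons (which a plain Delaunay triangulation of the points does not respect) is ear clipping; holes in the terrain would first have to be bridged into the outer contour. Below is a minimal, hedged sketch, assuming the interpolated Marching Squares contour has already been collected into a single counter-clockwise loop of points; the type and method names are illustrative.

```csharp
// Ear-clipping triangulation sketch for one simple, counter-clockwise contour (no holes).
// Returns index triples into 'poly'; O(n^3) worst case, fine for an illustration.
using System.Collections.Generic;
using System.Numerics;

static class EarClipper
{
    public static List<int> Triangulate(IReadOnlyList<Vector2> poly)
    {
        var triangles = new List<int>();
        var remaining = new List<int>();
        for (int i = 0; i < poly.Count; i++) remaining.Add(i);

        while (remaining.Count > 3)
        {
            bool clipped = false;
            for (int i = 0; i < remaining.Count; i++)
            {
                int prev = remaining[(i - 1 + remaining.Count) % remaining.Count];
                int curr = remaining[i];
                int next = remaining[(i + 1) % remaining.Count];
                Vector2 a = poly[prev], b = poly[curr], c = poly[next];

                // A candidate ear must be a convex corner of the CCW polygon...
                if (Cross(b - a, c - b) <= 0f) continue;

                // ...and must not contain any other remaining vertex.
                bool containsOther = false;
                foreach (int j in remaining)
                {
                    if (j == prev || j == curr || j == next) continue;
                    if (PointInTriangle(poly[j], a, b, c)) { containsOther = true; break; }
                }
                if (containsOther) continue;

                triangles.Add(prev); triangles.Add(curr); triangles.Add(next);
                remaining.RemoveAt(i);
                clipped = true;
                break;
            }
            if (!clipped) break; // degenerate or self-intersecting input: give up
        }
        if (remaining.Count == 3) triangles.AddRange(remaining);
        return triangles;
    }

    static float Cross(Vector2 u, Vector2 v) => u.X * v.Y - u.Y * v.X;

    static bool PointInTriangle(Vector2 p, Vector2 a, Vector2 b, Vector2 c)
    {
        float d1 = Cross(b - a, p - a);
        float d2 = Cross(c - b, p - b);
        float d3 = Cross(a - c, p - c);
        bool hasNeg = d1 < 0 || d2 < 0 || d3 < 0;
        bool hasPos = d1 > 0 || d2 > 0 || d3 > 0;
        return !(hasNeg && hasPos);
    }
}
```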

    Read the article

  • Implementing an automatic navigation mesh generation for 2d top down map?

    - by J2V
    I am currently in the middle of implementing A* pathfinding for enemies. In order to implement the actual A* logic, I need a navigation mesh for my map. I am working on a 2D top-down RPG map. The world is static, meaning there is no requirement for dynamic runtime mesh generation. My world objects are pixel-based, not tile-based, and have associated data with them such as scale, rotation, origin etc. I will obviously need some vertex data to be generated from my world objects - maybe generate polygons from colour data? I could create a colormap with objects for my whole map, but I have no idea how to begin creating nav mesh polygons. What would an actual navigation mesh generation look like with this kind of information available? Can anyone maybe point to some good resources? I have looked into some 3D nav mesh tools, but they seem overly complex for my situation and also get a lot of their required data from models. Thanks a lot in advance! I have been trying to get my head around it for some time now.
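    One low-tech way to get started, sketched below under the assumption that the static obstacles can be rasterised once into a boolean walkable grid at whatever resolution suits the map: greedily merge free cells into axis-aligned rectangles, then treat rectangles that share an edge as connected nodes for A*. The types and names are illustrative, not from any particular engine; a full nav mesh generator would instead trace obstacle outlines and triangulate, but the rectangle version is much easier to implement and debug.

```csharp
// Hedged sketch: greedy rectangle decomposition of a walkable/blocked grid.
// walkable[x, y] == true means the cell is free of obstacles.
using System.Collections.Generic;

struct NavRect
{
    public int X, Y, Width, Height;
    public NavRect(int x, int y, int width, int height)
    {
        X = x; Y = y; Width = width; Height = height;
    }
}

static class NavMeshBuilder
{
    public static List<NavRect> Decompose(bool[,] walkable)
    {
        int w = walkable.GetLength(0), h = walkable.GetLength(1);
        var used = new bool[w, h];
        var rects = new List<NavRect>();

        for (int y = 0; y < h; y++)
        {
            for (int x = 0; x < w; x++)
            {
                if (!walkable[x, y] || used[x, y]) continue;

                // Grow the rectangle to the right, then downward, while every cell is free.
                int rw = 1;
                while (x + rw < w && walkable[x + rw, y] && !used[x + rw, y]) rw++;
                int rh = 1;
                while (y + rh < h && RowFree(walkable, used, x, y + rh, rw)) rh++;

                for (int dy = 0; dy < rh; dy++)
                    for (int dx = 0; dx < rw; dx++)
                        used[x + dx, y + dy] = true;

                rects.Add(new NavRect(x, y, rw, rh));
            }
        }
        return rects;
    }

    static bool RowFree(bool[,] walkable, bool[,] used, int x, int y, int width)
    {
        for (int dx = 0; dx < width; dx++)
            if (!walkable[x + dx, y] || used[x + dx, y]) return false;
        return true;
    }
}
```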

    Read the article

  • Live chat solutions

    - by Lèse majesté
    What good live chat/live help solutions are available (preferably for use on a site hosted on a LAMP stack, and free)? I'm looking for a way to allow our sales and customer service reps to talk directly with visitors to our site. I've looked at phpopenchat, but it looks very unpolished. The only other free live chat app I've come across looked egregious. The aesthetics and UI design alone made me shudder to think what the underlying code might look like. This isn't a critical feature, and it wouldn't be hard to code up myself, so I'm not really looking for commercial software or paid services (unless there's a really compelling reason to use them). I'm just wondering if any other webmasters have come across a satisfactory free/open source solution for providing live customer support on their website. As a side note, live voice chat would also be an option, but it has to be designed (or customizable) for customer support rather than a public chatroom. Edit: Looking at the responses, it looks like there probably aren't going to be many free solutions for this type of business-oriented chat, so feel free to post answers even if they are commercial solutions, as long as they're a good value. Also feel free to post any alternate live support solutions (such as the Skype recommendation) that could be in some way integrated with a website. This will give me a good lay of the land for what people are actually using for live support, and I think it will be more helpful to others reading this question.

    Read the article

  • How To Watch Live Streaming of Oscars 2011 (Academy Awards)

    - by Kavitha
    The Academy Awards, more popularly known as the Oscars, will go live this year on Sunday, February 27, 2011 (8 PM ET / 5 PM PT) at the Kodak Theatre (Hollywood), Los Angeles, California. It's a star-studded event every movie lover wishes to follow and watch live. We at Tech Dreams always love to write about live streaming of popular events happening across the globe. Here is our guide to following Oscars 2011.

    Oscars 2011 live streams: Last year we did not have many choices for viewing the Oscars online, but this year there are plenty of them available from the best of the media powerhouses: APLive Oscars coverage on livestream.com (embedded below); Oscars.com - the official web site of the Academy Awards; Oscars.org live streaming; Academy Awards - official live streaming channel on livestream.com (embedded below); APLive Oscars coverage on Facebook.

    Watch Oscars 2011 on your iPad / iPhone: You can catch Oscars 2011 on your iOS devices - iPhone, iPad and iPod - for the first time ever using the official Oscars application. The application costs $0.99 and you can download it from the App Store.

    Websites to view highlights and exclusive clips of Oscars 2011: If you miss the live streaming of Oscars 2011, here are a few sites you can check for video highlights of the entire event; a few websites like Hulu have access to exclusive moments: Oscars' official YouTube channel; Hulu Award Season 2011 coverage.

    Oscars 2011 event schedule: Oscars 2011 will begin on Sunday, 27th February at 8 PM local time in California. The local time in India will be around 9:30 AM on Monday. Here is a list of major cities and the local time at which Oscars 2011 is going to start:
    California - February 27th, Sunday, 20:00
    Adelaide - February 28th, Monday, 14:30
    Bangkok - February 28th, Monday, 11:00
    Beijing - February 28th, Monday, 12:00
    Brisbane - February 28th, Monday, 14:00
    Cape Town - February 28th, Monday, 06:00
    Dubai - February 28th, Monday, 08:00
    Frankfurt - February 28th, Monday, 05:00
    Hong Kong - February 28th, Monday, 12:00
    Delhi/Chennai/Mumbai/Kolkata - February 28th, Monday, 09:30
    New York - February 27th, Sunday, 23:00
    Paris - February 28th, Monday, 05:00
    Washington - February 27th, Sunday, 23:00
    London - February 28th, Monday, 04:00
    For more cities visit this link.

    Read the article

  • USB Live, no hard disk

    - by sergey simeonov
    OK, so I own a Toshiba laptop, and the thing is my HDD went bad, but I had a USB flash drive with a live Ubuntu CD image on it. Now I don't have an HDD plugged into my laptop; I'm only running from the flash drive. My question is how much storage this live CD supports, because my flash drive is 32 GB, yet after about 1.5 GB of downloads a screen shows up telling me that I don't have enough space left. The other thing is I don't have money to get an HDD right now, so I want to use this flash drive for the time being - but can I somehow customize the CD through the live session so that I keep the programs I need after restarts? I tried Customization Live CD, but when I run it, it stops and the terminal says it could not execute, not enough space. So can anyone tell me what I can do about this space problem I'm having with my live USB Ubuntu?

    Read the article

  • Models with more than one mesh in JMonkeyEngine

    - by Andrea Tucci
    I'm a new jMonkeyEngine developer and I'm beginning to import models. I tried to import simple models and no problems appeared, but when I export obj models that have more than one mesh in the OgreXML format, Blender saves multiple meshes with their own materials (e.g. one mesh for the face, another for the body, etc.). Can I export all the meshes as one? I've tried to join all the meshes into a main one with Blender (face joins body), but when I export the model and then create the Spatial in jME (loading the path of the "merged" mesh), all the meshes that were joined to the main one don't have their materials! Here is a clearer example: I have an .obj model with 3 meshes and I export it. I get mesh1.mesh.xml, mesh2.mesh.xml, mesh3.mesh.xml and their materials mesh1.material, mesh2.material, mesh3.material. So I import the folder into Assets/Models/Test, and now I have to create something like: Spatial head = assetManager.loadModel( [path] ); Spatial face = assetManager.loadModel( [path] ) - one for each mesh - and then attach them to a common node. I think there must be a way to merge those meshes while maintaining their materials! What do you think? Thanks

    Read the article

  • Live Mesh starts exactly once on fresh Win7 Ultimate installation

    - by Reb.Cabin
    I did a fresh install of Windows 7 Ultimate 64-bit on a formatted drive on a refurbed Lenovo PC and applied all 102 (!) Windows updates; Windows seems to be working fine. No quirks installing, no apps, no junkware, just straight, legal Win7 Ultimate right from an unopened 2009 Microsoft box. OK, breathe a sigh. Install Live Mesh (no Messenger, no Mail, no Writer, no Photo, none of the rest of the Windows Live freeware). Set up my shares, let it run overnight, watch MOE.exe in the Task Manager performance pane to make sure it's all settled down, reboot. OK, check that MOE is running and files are getting updated properly from other machines in the mesh. Great. HOWEVER - when I try to launch the Windows Live Mesh app from the Start jewel, I get a brief hourglass, then nothing. Reboot. Same story. Result: the shares I already set up seem to be syncing properly, but I can't run the app, so I can't add and delete shares. The background process MOE seems to run, but I can't get the app going. By the way, the reason I did this fresh install is that I had exactly the same experience running Vista, so I wiped the machine hoping it would solve this nasty problem. Imagine my surprise! Will be grateful for clues, advice, etc. Please & thanks!

    Read the article

  • [Windows Live Messenger] Beta sounds

    - by sinni800
    Hello, in one beta version of Windows Live Messenger they had different sounds. They weren't like the current ones; they sounded brighter. The normal "dling" when logging in was replaced by a more direct "DIING!". It was only like that in one beta version, though. I was already searching for it when the sounds were changed back, but I lost the exe file back then. Does anyone know of this? Does anyone else remember it? Please!

    Read the article

  • (Unity) Getting a mirrored mesh from my data structure

    - by Steve
    Here's the background: I'm in the beginning stages of an RTS game in Unity. I have a procedurally generated terrain with a perlin-noise height map, as well as a function to generate a river. The problem is that the graphical creation of the map is taking the data structure of the map and rotating it by 180 degrees. I noticed this problem when I was creating my rivers. I would set the river's height to flat, and noticed that the actual tiles that were flat in the graphical representation were flipped and mirrored. Here are 3 screenshots of the map from different angles: http://imgur.com/a/VLHHq As you can see, if you flipped (graphically) the river by 180 degrees on the z axis, it would fit where the terrain is flattened. I have a suspicion it is being caused by a misunderstanding on my part of how vertices work. Alas, here is a snippet of the code that is used. This code creates a new array of Tile objects, which hold the information for each tile, including its type, coordinate, height, and its 4 vertices:

        public DTileMap (int size_x, int size_y)
        {
            this.size_x = size_x;
            this.size_y = size_y;

            // Initialize map_data array of Tile objects
            map_data = new Tile[size_x, size_y];
            for (int j = 0; j < size_y; j++) {
                for (int i = 0; i < size_x; i++) {
                    map_data[i, j] = new Tile();
                    map_data[i, j].coordinate.x = (int)i;
                    map_data[i, j].coordinate.y = (int)j;
                    map_data[i, j].vertices[0] = new Vector3(i * GTileMap.TileMap.tileSize, map_data[i, j].Height, -j * GTileMap.TileMap.tileSize);
                    map_data[i, j].vertices[1] = new Vector3((i + 1) * GTileMap.TileMap.tileSize, map_data[i, j].Height, -(j) * GTileMap.TileMap.tileSize);
                    map_data[i, j].vertices[2] = new Vector3(i * GTileMap.TileMap.tileSize, map_data[i, j].Height, -(j - 1) * GTileMap.TileMap.tileSize);
                    map_data[i, j].vertices[3] = new Vector3((i + 1) * GTileMap.TileMap.tileSize, map_data[i, j].Height, -(j - 1) * GTileMap.TileMap.tileSize);
                }
            }

    This code sets the river tiles to height 0:

        foreach (Tile t in map_data) {
            if (t.realType == "Water") {
                t.vertices[0].y = 0f;
                t.vertices[1].y = 0f;
                t.vertices[2].y = 0f;
                t.vertices[3].y = 0f;
            }
        }

    And below is the code to generate the actual graphics from the data:

        public void BuildMesh ()
        {
            DTileMap.DTileMap map = new DTileMap.DTileMap(size_x, size_z);

            int numTiles = size_x * size_z;
            int numTris = numTiles * 2;
            int vsize_x = size_x + 1;
            int vsize_z = size_z + 1;
            int numVerts = vsize_x * vsize_z;

            // Generate the mesh data
            Vector3[] vertices = new Vector3[numVerts];
            Vector3[] normals = new Vector3[numVerts];
            Vector2[] uv = new Vector2[numVerts];
            int[] triangles = new int[numTris * 3];

            int x, z;
            for (z = 0; z < vsize_z; z++) {
                for (x = 0; x < vsize_x; x++) {
                    normals[z * vsize_x + x] = Vector3.up;
                    uv[z * vsize_x + x] = new Vector2((float)x / size_x, 1f - (float)z / size_z);
                }
            }

            for (z = 0; z < vsize_z; z += 1) {
                for (x = 0; x < vsize_x; x += 1) {
                    if (x == vsize_x - 1 && z == vsize_z - 1) {
                        vertices[z * vsize_x + x] = DTileMap.DTileMap.map_data[x - 1, z - 1].vertices[3];
                    } else if (z == vsize_z - 1) {
                        vertices[z * vsize_x + x] = DTileMap.DTileMap.map_data[x, z - 1].vertices[2];
                    } else if (x == vsize_x - 1) {
                        vertices[z * vsize_x + x] = DTileMap.DTileMap.map_data[x - 1, z].vertices[1];
                    } else {
                        vertices[z * vsize_x + x] = DTileMap.DTileMap.map_data[x, z].vertices[0];
                        vertices[z * vsize_x + x + 1] = DTileMap.DTileMap.map_data[x, z].vertices[1];
                        vertices[(z + 1) * vsize_x + x] = DTileMap.DTileMap.map_data[x, z].vertices[2];
                        vertices[(z + 1) * vsize_x + x + 1] = DTileMap.DTileMap.map_data[x, z].vertices[3];
                    }
                }
            }

            for (z = 0; z < size_z; z++) {
                for (x = 0; x < size_x; x++) {
                    int squareIndex = z * size_x + x;
                    int triOffset = squareIndex * 6;
                    triangles[triOffset + 0] = z * vsize_x + x + 0;
                    triangles[triOffset + 2] = z * vsize_x + x + vsize_x + 0;
                    triangles[triOffset + 1] = z * vsize_x + x + vsize_x + 1;
                    triangles[triOffset + 3] = z * vsize_x + x + 0;
                    triangles[triOffset + 5] = z * vsize_x + x + vsize_x + 1;
                    triangles[triOffset + 4] = z * vsize_x + x + 1;
                }
            }

            // Create a new Mesh and populate with the data
            Mesh mesh = new Mesh();
            mesh.vertices = vertices;
            mesh.triangles = triangles;
            mesh.normals = normals;
            mesh.uv = uv;

            // Assign our mesh to our filter/renderer/collider
            MeshFilter mesh_filter = GetComponent<MeshFilter>();
            MeshCollider mesh_collider = GetComponent<MeshCollider>();
            mesh_filter.mesh = mesh;
            mesh_collider.sharedMesh = mesh;

            calculateMeshTangents(mesh);
            BuildTexture(map);
        }

    If this looks familiar to you, it's because I got most of it from Quill18. I've been slowly adapting it for my uses. And please include any suggestions you have for my code. I'm still in the very early prototyping stage.

    Read the article

  • Why is mesh baking causing huge performance spikes?

    - by jellyfication
    A couple of seconds into the gameplay on my Android device, I see huge performance spikes caused by "Mesh.Bake Scaled Mesh PhysX CollisionData". In my game, a whole level is a parent object containing multiple rigidbodies with mesh colliders. Every FixedUpdate(), my parent object rotates around the player. Rotating the world causes mesh scaling. Here is the code that handles world rotation:

        private void Update()
        {
            input.update();
            Vector3 currentInput = input.GetDirection();

            worldParent.rotation = initialRotation;
            worldParent.DetachChildren();
            worldParent.position = transform.position;
            world.parent = worldParent;

            worldParent.Rotate(Vector3.right, currentInput.x * 50f);
            worldParent.Rotate(Vector3.forward, currentInput.z * 50f);
        }

    How can I get rid of the mesh scaling? "Mesh.Bake Scaled Mesh PhysX CollisionData" seems to take effect only after some time; is it possible to disable this? The profiler looks like this: the bottom-left panel shows data before the spikes, the one on the right after.

    Read the article

  • Watch Indian TV Channels Live On Apple iPad and iPhone

    - by Kavitha
    Do you have your Apple iPad or iPhone with you but are getting bored on your journey? Don't worry: with the help of a small application called "YuppTV" you can watch live Indian TV channels free of cost along the way. The application can be downloaded directly from the App Store. On launching the application you will find a list of TV channels that are available for live streaming - a few of the popular channels available through the app are India Tv, 9XM, ABN Andhra Jyothi, DD Vyas, eTV2, HMTV, Maa Tv Telugu, NewX, NTv, RK News, Sakshi TV etc. Just tap on any of the channels in the list to view a live feed of that TV channel. Download YuppTV App From App Store

    Read the article

  • How can I generate a 2d navigation mesh in a dynamic environment at runtime?

    - by Stephen
    So I've grasped how to use A* for path-finding, and I am able to use it on a grid. However, my game world is huge and I have many enemies moving toward the player, which is a moving target, so a grid system is too slow for path-finding. I need to simplify my node graph by using a navigation mesh. I grasp the concept of "how" a mesh works (finding a path through nodes on the vertices and/or the centers of the edges of polygons). My game uses dynamic obstacles that are procedurally generated at run-time. I can't quite wrap my head around how to take a plane that has multiple obstacles in it and programmatically divide the walkable area up into polygons for the navigation mesh, like the following image. Where do I start? How do I know when a segment of walkable area is already defined, or worse, when I realize I need to subdivide a previously defined walkable area as the algorithm "walks" through the map? I'm using JavaScript in Node.js, if it matters.

    Read the article

  • How to create an Ubuntu 12.10 live CD?

    - by B Biswas
    I downloaded the Ubuntu 12.10 installer from the Ubuntu website. However, I find that it is not an ISO image and I am unable to create a live CD (or DVD) from it. I could not find any help on the Ubuntu website or elsewhere on the internet. Please help. PS - My OS is Windows XP. The Ubuntu installer I downloaded from the Ubuntu website is a zip file. I unzipped the file and it contains a wubi file.
    PS - Thanks, I could create a live CD in the end:
    1) First I tried to do it on my laptop, which has Windows 7. It showed the Ubuntu installer as a zip file and I could not burn it to a DVD. That is when I asked this question.
    2) Later I copied the installer to my desktop, which has Windows XP. There the installer shows as an ISO file, so I burnt it to a DVD and created the live CD. This works nicely on the desktop.
    3) I tried to run the live CD on my laptop, which is an AMD machine; the system does not boot up.
    4) On my office desktop, which has Windows 7, the Ubuntu installer shows as an ISO file.
    My questions are as follows:
    A) Why does the Ubuntu installer file show up differently on different machines?
    B) Why is the live CD not working on my laptop?

    Read the article

  • (initramfs) unable to find a live medium containing a live file system Toshiba

    - by Filkor
    When I try to boot from my USB it says: (initramfs) unable to find a live medium containing a live file system. I know there are lots of questions here about this problem... I've tried every solution but nothing worked. I'm just asking if anyone here who has a Toshiba Satellite Pro C650 has found a solution, because I simply can't install Ubuntu 10.04 because of this :((( Other specs: I have USB 2.0; I changed all possible settings in the BIOS (AHCI, etc.); the ISO is not corrupted (boot is successful on my other HP laptop). Thanks. Edit: But Xubuntu works :/

    Read the article

  • How to test other DE in Ubuntu 12.10 live (alpha)

    - by gsedej
    Caution: 12.10 is not yet released, but I was told the live session will stay as it is when the release happens. So, the new Ubuntu live session does not have an option to "log out" and choose a different desktop environment (DE). This function was usable if one installed Ubuntu live on a USB stick with persistent changes: one can install any software, including KDE (Plasma) or LXDE. Up to and including 12.04, one could simply log out and choose a different DE. Now there is no "log out" option in the top-right menu, and if you run service lightdm restart it automatically logs back in to the live session account with Unity.

    Read the article

  • How do I draw a scene with 2 nested frames

    - by Guido Granobles
    I have been trying for a long time to figure this out: I have loaded a model from a DirectX file (I am using OpenGL and Java). The model has a hierarchical system of nested reference frames (there are no bones). There are just 2 frames; one of them is called x3ds_Torso and it has a child frame called x3ds_Arm_01. Each one of them has a mesh. The thing is that I can't draw the arm connected to the body. Sometimes the body is in the center of the screen and the arm is at the top; sometimes they are both in the center. I know that I have to multiply the matrix transformation of every frame by its parent frame, starting from the top and going down, and after that I have to multiply every vertex of every mesh by its final transformation matrix. So I have this:

        public void calculeFinalMatrixPosition(Bone boneParent, Bone bone) {
            System.out.println("-->" + bone.name);
            if (boneParent != null) {
                bone.matrixCombined = bone.matrixTransform.multiply(boneParent.matrixCombined);
            } else {
                bone.matrixCombined = bone.matrixTransform;
            }
            bone.matrixFinal = bone.matrixCombined;
            for (Bone childBone : bone.boneChilds) {
                calculeFinalMatrixPosition(bone, childBone);
            }
        }

    Then I have to multiply every vertex of the mesh:

        public void transformVertex(Bone bone) {
            for (Iterator<Mesh> iterator = meshes.iterator(); iterator.hasNext();) {
                Mesh mesh = iterator.next();
                if (mesh.boneName.equals(bone.name)) {
                    float[] vertex = new float[4];
                    double[] newVertex = new double[3];
                    if (mesh.skinnedVertexBuffer == null) {
                        mesh.skinnedVertexBuffer = new FloatDataBuffer(mesh.numVertices, 3);
                    }
                    mesh.vertexBuffer.buffer.rewind();
                    while (mesh.vertexBuffer.buffer.hasRemaining()) {
                        vertex[0] = mesh.vertexBuffer.buffer.get();
                        vertex[1] = mesh.vertexBuffer.buffer.get();
                        vertex[2] = mesh.vertexBuffer.buffer.get();
                        vertex[3] = 1;
                        newVertex = bone.matrixFinal.transpose().multiply(vertex);
                        mesh.skinnedVertexBuffer.buffer.put(((float) newVertex[0]));
                        mesh.skinnedVertexBuffer.buffer.put(((float) newVertex[1]));
                        mesh.skinnedVertexBuffer.buffer.put(((float) newVertex[2]));
                    }
                    mesh.vertexBuffer = new FloatDataBuffer(mesh.numVertices, 3);
                    mesh.skinnedVertexBuffer.buffer.rewind();
                    mesh.vertexBuffer.buffer.put(mesh.skinnedVertexBuffer.buffer);
                }
            }
            for (Bone childBone : bone.boneChilds) {
                transformVertex(childBone);
            }
        }

    I know this is not the most efficient code, but for now I just want to understand exactly how a hierarchical model is organized and how I can draw it on the screen. Thanks in advance for your help.

    Read the article
