Search Results

Search found 5873 results on 235 pages for 'raster graphics'.


  • Laptop runs HOT after 12.10 upgrade!

    - by dinkelk
    I was running 12.04 for 6 months; my laptop ran almost silently and cool enough to hold on my lap. I updated to 12.10, and now my computer gets too hot to hold on my lap and the fan is constantly running on full blast. This is the output of sensors:

        acpitz-virtual-0
        Adapter: Virtual device
        temp1:          +84.0°C  (crit = +99.0°C)

        coretemp-isa-0000
        Adapter: ISA adapter
        Physical id 0:  +84.0°C  (high = +86.0°C, crit = +100.0°C)
        Core 0:         +74.0°C  (high = +86.0°C, crit = +100.0°C)
        Core 1:         +72.0°C  (high = +86.0°C, crit = +100.0°C)
        Core 2:         +75.0°C  (high = +86.0°C, crit = +100.0°C)
        Core 3:         +84.0°C  (high = +86.0°C, crit = +100.0°C)

        radeon-pci-0100
        Adapter: PCI adapter
        temp1:          +76.0°C

    I have an HP Pavilion dv6 with an i7 and AMD Radeon graphics. Please let me know if you need additional information. What could be different between the two Ubuntu editions that caused such a drastic change?

    Edit 1: Per @Paul's suggestion, I ran htop to try to narrow down the problem. Here is the result (screenshots of the left and right sides of the terminal). This is about 10 minutes after boot-up; htop, yakuake, and a Chrome page with one tab opened to this question are all that I have manually opened. The most taxing program for the CPU is htop itself. I think the problem must lie elsewhere; my temps are already up to ~65°C for the CPU and ~69°C for the GPU, with nearly 0% CPU usage.

    Read the article

  • Automatically revert to laptop screen when external monitor unplugged

    - by Ryan
    I regularly use an external monitor with my laptop, and I usually have the laptop screen disabled while the monitor is connected. This seems to cause problems when the monitor is disconnected. If the monitor is disconnected while the laptop screen is disabled, I can't get the X session to show up at all: I can Ctrl+Alt+F1 to open a terminal, and that works fine, but Ctrl+Alt+F7 does nothing; the display is blank, and stays blank. The same thing happens whether I put the computer to sleep with the monitor connected, or if I disconnect while the computer is still awake. Rebooting the computer fixes the issue, as does killing Xorg and starting it again, but both of those are sub-optimal since I lose my current session. I'm currently using the open source graphics driver (xserver-xorg-video-ati). This question looks like it might answer my question, but unfortunately hwinfo is no longer available in the apt repository. Is there a way with current tools to automatically detect when the external monitor is disconnected and switch to the laptop display?

    Read the article

  • Ubuntu Server 11.10 boot, white terminal with garbled black text

    - by SpeedCrazy
    I just installed Ubuntu Server 11.10 and the install went fine. This system is running on an Intel Pentium II board with onboard graphics. However, when I try to boot into Ubuntu I get a white terminal with garbled black text. I have tried various grub 'fixes', as googling the issue suggested it was a resolution or grub related issue. I cannot ssh in, so the issue affects more than just the display. I have had no luck with anything thus far and am at my wits' end. This was my first Ubuntu excursion, as my friend told me it was better for servers than CentOS because it was easier... Not so much... Does anyone have any ideas as to what the issue could be? When answering, bear in mind I am an Ubuntu noob and Linux novice. As of 1/26/12 I have tried to add the console=tty1 line to /etc/default/grub and run update-grub. This results in the line in the boot parameters that normally reads:

        linux /vmlinuz-3.0.0-12-generic-pae root=/dev/mapper/dev-root ro vt.handoff=7

    now reading:

        linux /vmlinuz-3.0.0-12-generic-pae root=/dev/mapper/dev-root ro console=tty1 vt.handoff=7

    This does not work. Is there any way to have console=tty1 inserted on a line by itself? I am at my wits' end. Thanks for all your help, Speed

    Read the article

  • Almost working 2D Collisions

    - by TheGag96
    I'm terribly sorry to ask this question YET AGAIN, but I can almost guarantee that this will be the last time I'll have to ask. I'm currently on the verge of FINALLY getting these collisions to work for my game, made with libGDX in Java. My collisions use the same method as (and are basically copied and modified code from) the XNA Platformer example (here), where the direction of the collision is based on the rectangle where two objects are overlapping. The collisions themselves almost work perfectly, but for some reason, holding down or up together with left while colliding with the floor or ceiling doesn't seem to work well, and I'm not at all sure why. Instead of vaguely giving my problem and snippets of code, I've decided to instead provide a binary and the source of the game I have so far, so you can see for yourself what my problem is. Link. (Note: make sure you unzip everything into a folder somewhere or it will not work.) You'll find the collision code in the method workingCollisions() in Link.java. Please excuse the messy code and terrible graphics, as this whole thing is in pre-pre-alpha. If anyone is kind enough to help me out here, you are the best person ever. I'm completely desperate; I've been trying this on and off for months and I just can't get it to work. I cannot thank you enough.
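
    For comparison, the XNA Platformer sample this code was adapted from resolves each overlap along the shallower axis of the intersection depth and re-reads the bounding rectangle after every correction; resolving several tiles against stale bounds is a classic cause of diagonal input misbehaving against floors and ceilings. A minimal sketch of that resolution loop in C# (nearbyTileBounds, position, and BoundingRectangle stand in for the game's own tile query and player state; GetIntersectionDepth is the sample's RectangleExtensions helper):

        // Sketch of XNA Platformer-style per-tile resolution.
        private void HandleCollisions()
        {
            Rectangle bounds = BoundingRectangle;

            foreach (Rectangle tile in nearbyTileBounds)
            {
                Vector2 depth = RectangleExtensions.GetIntersectionDepth(bounds, tile);
                if (depth == Vector2.Zero)
                    continue;

                // Push out along the shallower axis only...
                if (Math.Abs(depth.Y) < Math.Abs(depth.X))
                    position = new Vector2(position.X, position.Y + depth.Y);
                else
                    position = new Vector2(position.X + depth.X, position.Y);

                // ...and recompute the bounds before testing the next tile;
                // skipping this refresh makes later tiles resolve against a
                // position that no longer exists.
                bounds = BoundingRectangle;
            }
        }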

    Read the article

  • Disappearing instances of VertexPositionColor using MonoGame

    - by Rosko
    I am a complete beginner in graphics development with XNA/MonoGame. I started my own project using MonoGame 3.0 for WinRT. I have this unexplainable issue that some of the vertices disappear while doing some updates on them. Basically, it is a game with balls which collide with the walls and with each other, and in certain conditions they explode. When they explode they disappear. Here is a video demonstrating the issue. I used wireframes so that it is easier to see how vertices are missing. The perfect exploding balls are the ones which result from user input with mouse clicking. Thanks for the help. The situation is: I draw user primitives with triangle strips like this:

        graphicsDevice.DrawUserPrimitives<VertexPositionColor>(PrimitiveType.TriangleStrip, circleVertices, 0, primitiveCount);

    All of the primitives are in the z-plane (z = 0), so I thought it was the culling in action. I tried setting the culling mode to none, but it did not help. Here is the code responsible for the explosion:

        private void Explode(GameTime gameTime, ref List<Circle> circles)
        {
            if (this.isExploding)
            {
                for (int i = 0; i < this.circleVertices.Length; i++)
                {
                    if (this.circleVertices[i] != this.circleCenter)
                    {
                        if (Vector3.Distance(this.circleVertices[i].Position, this.circleCenter.Position) < this.explosionRadius * precisionCoefficient)
                        {
                            var explosionVector = this.circleVertices[i].Position - this.circleCenter.Position;
                            explosionVector.Normalize();
                            explosionVector *= explosionSpeed;
                            circleVertices[i].Position += explosionVector * (float)gameTime.ElapsedGameTime.TotalSeconds;
                        }
                        else
                        {
                            circles.Remove(this);
                        }
                    }
                }
            }
        }

    I'd be really grateful if anyone has suggestions about how to fix this issue.
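
    Worth noting about the snippet above: circles.Remove(this) runs from inside the per-vertex loop, so the circle is removed as soon as any single vertex fails the radius check, and the list the caller may be iterating gets mutated mid-flight; either effect can make geometry vanish early. A sketch of a deferred variant (the isFinished flag and the caller-side RemoveAll are assumptions, not MonoGame API):

        // Remove the circle only after *every* vertex has left the explosion
        // radius, and let the caller prune finished circles afterwards with
        // circles.RemoveAll(c => c.isFinished) instead of mutating the list
        // from inside the loop.
        private void Explode(GameTime gameTime)
        {
            if (!this.isExploding)
                return;

            bool anyVertexStillInside = false;
            float dt = (float)gameTime.ElapsedGameTime.TotalSeconds;

            for (int i = 0; i < this.circleVertices.Length; i++)
            {
                if (this.circleVertices[i] == this.circleCenter)
                    continue;

                if (Vector3.Distance(this.circleVertices[i].Position, this.circleCenter.Position)
                        < this.explosionRadius * precisionCoefficient)
                {
                    anyVertexStillInside = true;
                    var v = this.circleVertices[i].Position - this.circleCenter.Position;
                    v.Normalize();
                    this.circleVertices[i].Position += v * explosionSpeed * dt;
                }
            }

            this.isFinished = !anyVertexStillInside;  // assumed flag read by the caller
        }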

    Read the article

  • Extremely simple online multiplayer game

    - by Postscripter
    I am considering creating a simple multiplayer game which focuses on physics and can accommodate up to 30 players per session. Very simple graphics, but smart physics (pushing, weight and gravity, balance) is required. After some research I found a good JavaScript physics framework called box2d.js. I found the demo to be excellent; this is the kind of physics I am looking for in my game. Now, what other frameworks will I need? Node.js? Prototype.js? (By the way, I found that the latest version of Prototype.js was released in 2010... Is this still supported? Should I avoid using it?) What about HTML5 and Canvas? Would I need them? WebSockets? I am a beginner in the web programming and game programming worlds, but I will learn fast; I am a computer science graduate (with not much web experience, but I know the essentials: JavaScript, HTML, CSS). I just need a guiding path to build my game. Thanks

    Read the article

  • How to help FGLRX detect a device

    - by user113416
    I have an HD 4850 card and Ubuntu 12.10, and installed legacy drivers using the makson96 PPA. The issue is that FGLRX cannot detect my device and loads the vesa BIOS. I had the same problem on Ubuntu 11.10 and 12.04. I want to manually help fglrx find a matching device to load, as it should do. It is interesting: why does fglrx search for a device on a PCI:0@1:0:1 bus? In xorg.conf a different bus is indicated:

        Section "Device"
            Identifier "aticonfig-Device[0]-0"
            Driver "fglrx"
            BusID "PCI:1:0:0"
        EndSection

    fglrxinfo:

        display: :0.0  screen: 0
        OpenGL vendor string: Advanced Micro Devices, Inc.
        OpenGL renderer string: ATI Radeon HD 4800 Series
        OpenGL version string: 3.3.11653 Compatibility Profile Context

    Here is a part of my Xorg log:

        [ 3.846] (II) VESA: driver for VESA chipsets: vesa
        [ 3.846] (II) FBDEV: driver for framebuffer: fbdev
        [ 3.846] (++) using VT number 7
        [ 3.846] (WW) Falling back to old probe method for fglrx
        [ 3.883] (II) Loading PCS database from /etc/ati/amdpcsdb
        [ 3.883] (--) Assigning device section with no busID to primary device
        [ 3.883] (--) Chipset Supported AMD Graphics Processor (0x9442) found
        [ 3.884] (WW) fglrx: No matching Device section for instance (BusID PCI:0@1:0:1) found
        [ 3.884] (II) AMD Video driver is running on a device belonging to a group targeted for this release
        [ 3.884] (II) AMD Video driver is signed
        [ 3.884] (II) fglrx(0): pEnt->device->identifier=0xb7791d8f
        [ 3.884] (WW) Falling back to old probe method for vesa
        [ 3.884] (WW) Falling back to old probe method for fbdev

    Thanks in advance.

    Read the article

  • Thunderbird keeps crashing Ubuntu 12.04 64 bit

    - by maurizio ribera d'alcala'
    I have been using Thunderbird for years under different versions of Ubuntu, including 12.04 since its release. For a couple of days it has kept crashing a few seconds after starting. I tried to reinstall it, creating a new profile and copying the old mail, file by file. After one day of normal functioning it started to crash again. Mozilla is receiving the crash reports. The following is the content of the last one:

        Add-ons: [email protected]:15.0,[email protected]:0.3.11,[email protected]:0.9.3,[email protected]:3.4.1,[email protected]:15.0,[email protected]:0.5.1,{b4447f60-db9c-11da-a94d-0800200c9a66}:0.9.1,{972ce4c6-7e08-4474-a285-3208198ce6fd}:15.0
        BuildID: 20120827103657
        CrashTime: 1347200254
        EMCheckCompatibility: true
        Email: [email protected]
        FramePoisonBase: 7ffffffff0dea000
        FramePoisonSize: 4096
        InstallTime: 1346431480
        Notes: OpenGL: Tungsten Graphics, Inc -- Mesa DRI Intel(R) Sandybridge Mobile -- 3.0 Mesa 8.0.2 -- texture_from_pixmap
        ProductID: {3550f703-e582-4d05-9a08-453d09bdfdc6}
        ProductName: Thunderbird
        ReleaseChannel: release
        SecondsSinceLastCrash: 1109
        StartupTime: 1347200242
        Theme: classic/1.0
        Throttleable: 1
        Vendor:
        Version: 15.0

    I started Thunderbird in safe mode and tested all the add-ons. Apparently it is the 'Unity Launcher Integration' that creates the problem. I say apparently because I want to wait two or three days to be sure TB has returned to its regular functioning. Is this a bug? Can it be solved?

    Read the article

  • Why does Unity not extend to my 2nd monitor, even when it is displaying an X-Screen?

    - by Gridwalker
    I recently added a second video card to my system, but Unity refuses to extend my desktop over to the second screen. Although the secondary monitor initialises when I boot and I can move the mouse cursor over to the second screen, the screen is otherwise blank (showing no wallpaper or interface elements) and I am unable to move any windows to this monitor. Moving the mouse cursor over to the second monitor changes it from the default cursor to the old-style X cursor, such as the one that appears when you run xkill, indicating that this screen is initialised in the X server but that Unity is not recognising it. Although the NVIDIA X Server Settings application can see both monitors, the Unity System Settings application does not detect the second adapter. Sometimes the Additional Drivers application can see both adapters, but it doesn't consistently show options for them both. xrandr also fails to detect the second monitor, but iNex lists both adapters. I have experimented with several different drivers for each adapter and with setting each of the graphics cards as the primary adapter in the BIOS, but this has made little difference. The two adapters are an onboard GeForce 8200 and a PCIe GeForce 7200 GX. The onboard adapter is currently set as the primary; however, this adapter crashes whenever I use the Nouveau driver, and I have to switch over to the PCIe card as primary whenever I purge the proprietary drivers (switching back when the 304 driver has been reinstalled). It doesn't matter which adapter I set as my primary, the results are the same: one screen showing the Unity interface and one screen showing an X screen that only displays the mouse cursor. All I want is to be able to run this system in a dual-screen configuration. I am not a gamer, nor do I require 3D rendering capabilities. Anything you can suggest to get the desktop to extend across both screens will be massively appreciated!

    Read the article

  • Moving while doing loop animation in RPGMaker

    - by AzDesign
    I made a custom class to display a character portrait in RPG Maker XP. Here is the class:

        class Poser
          attr_accessor :images
          def initialize
            @images = Sprite.new
            @images.bitmap = RPG::Cache.picture('Character.png') # 100x300
            @images.x = 540 # place it on the bottom right corner of the screen
            @images.y = 180
          end
        end

    Create an event on the map to create an instance as a global variable, tada! The image popped out. OK, nice. But I'm not satisfied. Now I want it to have a bobbing-head animation (just like when we breathe, we sometimes bob our head up and down), so I added more methods:

        def move(x,y)
          @images.x += x
          @images.y += y
        end

        def animate(x,y,step,delay)
          forward = true
          2.times {
            step.times {
              wait(delay)
              if forward
                move(x/step,y/step)
              else
                move(-x/step,-y/step)
              end
            }
            wait(delay*3)
            forward = false
          }
        end

        def wait(time)
          while time > 0
            time -= 1
            Graphics.update
          end
        end

    I called the method to run the animation and it works. So far so good, but the problem is that WHILE the portrait goes up and down, I cannot move my character until the animation is finished. So that's it, I'm stuck in the loop block. What I want is to watch the portrait move up and down while I walk around the village, talk to NPCs, etc. Can anyone solve my problem, or offer a better solution? Thanks in advance

    Read the article

  • Lifebook LH531 inbuilt card reader not working

    - by chandrasekar
    The inbuilt card reader (SD/PRO/SDHC) is not working. When I insert the memory card, the indicator comes on for a split second and nothing happens. I use Ubuntu 11.10. Please help. When I do lspci it gives the output pasted below:

        pro-hq@prohq-LIFEBOOK-LH531:~$ lspci
        00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09)
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04)
        00:1a.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 05)
        00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 05)
        00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b5)
        00:1c.2 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 3 (rev b5)
        00:1d.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 05)
        00:1f.0 ISA bridge: Intel Corporation HM65 Express Chipset Family LPC Controller (rev 05)
        00:1f.2 SATA controller: Intel Corporation 6 Series/C200 Series Chipset Family 6 port SATA AHCI Controller (rev 05)
        00:1f.3 SMBus: Intel Corporation 6 Series/C200 Series Chipset Family SMBus Controller (rev 05)
        01:00.0 Network controller: Intel Corporation Centrino Advanced-N 6205 (rev 34)
        02:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 06)

    Read the article

  • Ubuntu 64bit Black Screen on Minecraft

    - by Signify
    I have tried posting on the forums, but I really need help (I'm a server admin and really don't want to have to switch to Windows just to run Minecraft). Anyhow, I originally was running OpenJDK 6, as I was told that 7 was unstable, and was getting periodic lag spikes while walking (at least once every 3 seconds the screen would freeze for a tenth of a second). After that, I attempted to install Sun's Java JDK 7 (I couldn't get hold of 6 without signing up for Oracle's newsletters). Upon attempting to run Minecraft, I got a black screen after logging in, with this error message:

        27 achievements
        182 recipes
        Setting user: Thunder7102, -1618112820878091307
        Exception in thread "Minecraft main thread" java.lang.UnsatisfiedLinkError: /home/noiro/.minecraft/bin/natives/liblwjgl.so: /home/noiro/.minecraft/bin/natives/liblwjgl.so: wrong ELF class: ELFCLASS32 (Possible cause: architecture word width mismatch)
            at java.lang.ClassLoader$NativeLibrary.load(Native Method)
            at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1939)
            at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1864)
            at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1825)
            at java.lang.Runtime.load0(Runtime.java:792)
            at java.lang.System.load(System.java:1059)
            at org.lwjgl.Sys$1.run(Sys.java:69)
            at java.security.AccessController.doPrivileged(Native Method)
            at org.lwjgl.Sys.doLoadLibrary(Sys.java:65)
            at org.lwjgl.Sys.loadLibrary(Sys.java:81)
            at org.lwjgl.Sys.<clinit>(Sys.java:98)
            at org.lwjgl.opengl.Display.<clinit>(Display.java:132)
            at net.minecraft.client.Minecraft.a(SourceFile:184)
            at net.minecraft.client.Minecraft.run(SourceFile:657)
            at java.lang.Thread.run(Thread.java:722)

    Now, this got me fed up, so I tried to install a Windows 7 virtual machine through VirtualBox. I gave it 256 MB of graphics memory with 2D and 3D acceleration and 3 GB of RAM, and installed Java JDK 7 for Windows (which does work, from experience on my other Windows 7 partition). Once again, a black screen after login. What the heck is going on, guys? My system specs:

        Ubuntu 12.04 64-bit, fully updated, running GNOME 3
        Nvidia GTS 450 1.3GB OC'd
        AMD Athlon II X4 2.8GHz
        6GB of RAM

    So, what do you think?

    Read the article

  • How to make my simple round sprite look right in XNA

    - by Joshua Perina
    OK, I'm very new to graphics programming (but not new to coding). I'm trying to load a simple image in XNA, which I can do fine. It is a simple round circle which I made in Photoshop. The problem is that the edges show up rough when I draw it on the screen, even at the exact size; the anti-aliasing is missing. I'm sure I'm missing something very simple:

        GraphicsDevice.Clear(Color.Black);

        // TODO: Add your drawing code here
        spriteBatch.Begin();
        spriteBatch.Draw(circle, new Rectangle(10, 10, 10, 10), Color.White);
        spriteBatch.End();

    I couldn't post a picture because I'm a first-time poster, but my smooth PNG circle has rough edges. I found that if I added:

        spriteBatch.Begin(SpriteSortMode.FrontToBack, BlendState.NonPremultiplied);

    I can get a smooth image when the image is the same size as the original PNG. But if I want to scale that image up or down, then the rough edges return. How do I get XNA to smoothly resize my simple round image to a smaller size without getting the rough edges?
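
    One plausible reading: with BlendState.NonPremultiplied, linear filtering blends the colour of fully transparent texels into the edge while the image is scaled, which shows up as a rough or haloed outline. The XNA 4.0 content pipeline premultiplies alpha by default, and premultiplied art scales cleanly under BlendState.AlphaBlend. A minimal sketch, assuming the texture is processed with the pipeline's default premultiplied-alpha setting:

        // Premultiplied alpha + linear filtering scales without edge halos.
        spriteBatch.Begin(
            SpriteSortMode.FrontToBack,
            BlendState.AlphaBlend,        // expects premultiplied texture data
            SamplerState.LinearClamp,     // linear filtering for smooth scaling
            DepthStencilState.None,
            RasterizerState.CullCounterClockwise);
        spriteBatch.Draw(circle, new Rectangle(10, 10, 10, 10), Color.White);
        spriteBatch.End();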

    Read the article

  • 12.10 no audio via hdmi and video speeds up

    - by jackson
    I have a laptop with an ATI Radeon 4200. On 12.04 everything worked fine; since upgrading to 12.10 I cannot get sound over HDMI. When I switch to HDMI audio, the video speeds up to about 2x. I can use the speakers in my laptop and watch video via HDMI with no problems. Things I have tried: various tutorials to install the AMD/ATI drivers, all of which resulted in low-graphics mode; checked that everything is properly set in alsamixer and the sound utility; installed pavucontrol and checked everything in there; verified the output from cat /proc/asound/cards looks normal. When I initially upgraded there was a plethora of problems, which I believe were due to the old proprietary driver still being in use but not compatible; after a few hours trying to fix that I decided just to back up and do a fresh install, which works great except for the above-stated problem. Any help would be greatly appreciated!! Finally, hopefully this hasn't already been answered; I have tried a few different searches on the boards and haven't come up with anything.

        $ aplay -l
        **** List of PLAYBACK Hardware Devices ****
        card 0: SB [HDA ATI SB], device 0: ALC269VB Analog [ALC269VB Analog]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 1: HDMI [HDA ATI HDMI], device 3: HDMI 0 [HDMI 0]
          Subdevices: 0/1
          Subdevice #0: subdevice #0

    Read the article

  • 3D physics engine for accurate collision handling on desktop/laptop computers (non-console)

    - by Georges Oates Larsen
    What are your suggestions for a physics engine that satisfies the following criteria?

    - Capable of calculating collisions between multiple concave mesh-based colliders.
    - Handles many collisions going on at once (for instance one mesh being wedged between two others, which themselves may be wedged between two meshes).
    - Does not allow for collider passthrough, even at high speeds. For instance, if I am applying force to a programmatically hinged object that makes it spin, I do not want it to pass through another rigidbody that it collides with while spinning. I have this problem using PhysX.
    - As implied before, reacts well to hinged objects. It preferably has its own implementation of a hinge, but I am willing to program my own; the important part is that it has some sort of interface that guarantees accurate collision tracking even when dealing with these things.
    - Platform independent: runs on Mac as well as PC, and is not tied down to specific graphics cards.

    I think that's the best way to explain what I am looking for. Basically, I need SUPER reliable collisions: something that can't be accomplished with a simple ray-casting approach that sends a ray from the last position of the object to the current position (as this object may be potentially large and colliding with small objects via rotation); a sketch of that ruled-out approach appears below. Bonus points for also including an OPEN SOURCE engine.
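
    To make the ruled-out approach concrete: the simple sweep casts one ray from the previous position to the current one, so it samples a single point of the body and sees nothing of the rotation between frames. A sketch, with IScene.Raycast and RaycastHit as hypothetical stand-ins for whatever query a real engine exposes:

        using System.Numerics;

        // Hypothetical query types, for illustration only.
        struct RaycastHit { public Vector3 Point; }
        interface IScene
        {
            bool Raycast(Vector3 origin, Vector3 dir, float maxDist, out RaycastHit hit);
        }

        static class NaiveSweep
        {
            // One ray through the center: fast *linear* motion is caught,
            // but contact caused purely by rotation never registers, and a
            // large body's extremities are never sampled at all.
            public static Vector3 Sweep(IScene scene, Vector3 previous, Vector3 current, float skinWidth)
            {
                Vector3 delta = current - previous;
                float length = delta.Length();
                if (length <= 0f)
                    return current;

                Vector3 dir = delta / length;
                if (scene.Raycast(previous, dir, length, out RaycastHit hit))
                    return hit.Point - dir * skinWidth;  // clamp to first contact

                return current;
            }
        }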

    Read the article

  • generating maps

    - by gardian06
    This is a conglomeration question; when answering, please specify which part you are addressing.

    I am looking at creating a maze-type game that utilizes elevation. I have a few features I would like to have, but am unsure as to some of the implementation. I have done work on file-I/O maze generation (using a key to read the file, and then generating the level based on that file), but I am unsure how to think about this with elevation in the mix. I think height maps might be a good approach, but I don't know how to represent them effectively.

    For a height map, which is more beneficial: XML (containing h[u,v] data and a key definition), CSV (item 1 is the key reference, item 2 is the elevation), or another approach that I have not thought of yet? (A parsing sketch follows this entry.)

    When it comes to placing the elevation values themselves, what kind of delta-h values are appropriate to make elevation noticeable at about a 60-degree viewing angle while not really affecting gravity-driven physics (assuming some effect while moving uphill/downhill)?

    I am thinking of maybe going to procedural generation at some point, but am wondering if it is practical to have a procedurally generated grid (wall squares possibly the same dimensions as the open-space squares), or if designing with thin walls between open spaces is better. This decision will affect the amount of work needed on the graphics end for uniform vs. irregular walls.

    EDIT: The game will be an elevation maze shooter. Levels/maps will be mazes with elevation the player has to negotiate. Elevation will have effects on "combat" vision and movement.
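
    On the CSV question specifically, the trade-off is mostly about tooling: CSV stays easy to hand-edit and diff, while XML buys schema validation at the cost of verbosity. A minimal sketch of a CSV reader under an assumed cell layout (the layout in the comments is an assumption, not part of the question):

        using System;
        using System.IO;

        // Assumed layout: one text row per grid row, cells split by ';',
        // each cell holding "key,elevation", e.g. "W,2.5".
        class HeightMapLoader
        {
            public static (char Key, float H)[,] Load(string path)
            {
                string[] rows = File.ReadAllLines(path);
                int height = rows.Length;
                int width = rows[0].Split(';').Length;
                var map = new (char Key, float H)[height, width];

                for (int y = 0; y < height; y++)
                {
                    string[] cells = rows[y].Split(';');
                    for (int x = 0; x < width; x++)
                    {
                        string[] parts = cells[x].Split(',');
                        map[y, x] = (parts[0][0], float.Parse(parts[1]));
                    }
                }
                return map;
            }
        }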

    Read the article

  • What is a "Technical Programmer"? [closed]

    - by Mike E
    I've noticed on job boards a few postings, all from European companies in the games industry, for a "Technical Programmer". The job description in each was similar, having to do with tools development, 3D graphics programming, etc. It seems to be somewhere between a Technical Artist who's more technical than artist (or who can code), and a Technical Director, but perhaps without the seniority/experience. Information elsewhere on the position is sparse. The title seems redundant, and I haven't seen any American companies post jobs by that name exactly. One example is this job posting on gamedev.net, which isn't exactly thorough. In case the link dies:

        Subject: Technical Programmer

        Frictional Games, the creators of Amnesia: The Dark Descent and the Penumbra series, are looking for a talented programmer to join the company! You will be working for a small team with a big focus on finding new and innovating solutions. We want you who are not afraid to explore uncharted territory and constantly learn new things. Self-discipline and independence are also important traits as all work will be done from home.

        Some of the things you will work with include:
        - 3D math, rendering, shaders and everything else related.
        - Console development (most likely Xbox 360).
        - Hardware implementations (support for motion controls, etc).

        All coding is in C++, so great skills in that is imperative.

    As I mentioned, the job title has appeared from European companies, so maybe it goes by another title in America. What other titles might this specialization of programmer go by?

    Read the article

  • Java Slick2d - Mouse picking how to take into account camera

    - by Corey
    When I move the camera, it obviously changes the viewport, so my mouse picking is off. My camera is just a float x and y, and I use g.translate(-cam.cameraX+400, -cam.cameraY+300); to translate the graphics (the numbers are hard-coded just for testing purposes). How would I take the camera into account so my mouse picking works correctly?

        double mousetileX = Math.floor((double)mouseX/tiles.tileWidth);
        double mousetileY = Math.floor((double)mouseY/tiles.tileHeight);
        double playertileX = Math.floor(playerX/tiles.tileWidth);
        double playertileY = Math.floor(playerY/tiles.tileHeight);
        double lengthX = Math.abs((float)playertileX - mousetileX);
        double lengthY = Math.abs((float)playertileY - mousetileY);
        double distance = Math.sqrt((lengthX*lengthX)+(lengthY*lengthY));
        if(input.isMousePressed(Input.MOUSE_LEFT_BUTTON) && distance < 4) {
            if(tiles.map[(int)mousetileX][(int)mousetileY] == 1) {
                tiles.map[(int)mousetileX][(int)mousetileY] = 0;
            }
        }

    That is my mouse picking code.
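
    Since the draw call translates by (-cameraX+400, -cameraY+300), the screen-to-world inverse is to add that offset back before dividing by the tile size: worldX = mouseX + cameraX - 400, worldY = mouseY + cameraY - 300. A small self-contained sketch of that inverse in C# (the 400/300 values are the question's hard-coded half-screen offsets):

        using System;

        static class Picking
        {
            // Undo the camera translation before the tile math: the draw call
            // subtracts (cameraX - 400, cameraY - 300), so picking adds it back.
            public static (int X, int Y) PickTile(
                float mouseX, float mouseY,
                float cameraX, float cameraY,
                int tileWidth, int tileHeight)
            {
                float worldX = mouseX + cameraX - 400f;
                float worldY = mouseY + cameraY - 300f;
                return ((int)Math.Floor(worldX / tileWidth),
                        (int)Math.Floor(worldY / tileHeight));
            }
        }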

    Read the article

  • I've inherited 200K lines of spaghetti code -- what now?

    - by kmote
    I hope this isn't too general a question; I could really use some seasoned advice. I am newly employed as the sole "SW Engineer" in a fairly small shop of scientists who have spent the last 10-20 years cobbling together a vast code base. (It was written in a virtually obsolete language, G2: think Pascal with graphics.) The program itself is a physical model of a complex chemical processing plant; the team that wrote it has incredibly deep domain knowledge but little or no formal training in programming fundamentals. They've recently learned some hard lessons about the consequences of non-existent configuration management. Their maintenance efforts are also greatly hampered by the vast accumulation of undocumented "sludge" in the code itself. I will spare you the "politics" of the situation (there's always politics!), but suffice it to say, there is no consensus of opinion about what is needed for the path ahead. They have asked me to begin presenting to the team some of the principles of modern software development. They want me to introduce some of the industry-standard practices and strategies regarding coding conventions, lifecycle management, high-level design patterns, and source control. Frankly, it's a fairly daunting task and I'm not sure where to begin. Initially, I'm inclined to tutor them in some of the central concepts of The Pragmatic Programmer, or Fowler's Refactoring ("Code Smells", etc.). I also hope to introduce a number of Agile methodologies. But ultimately, to be effective, I think I'm going to need to home in on 5-7 core fundamentals; in other words, what are the most important principles or practices that they can realistically start implementing that will give them the most "bang for the buck"? So that's my question: what would you include in your list of the most effective strategies to help straighten out the spaghetti (and prevent it in the future)?

    Read the article

  • Banshee doesn't like opening websites

    - by Allan
    I have come across two bugs (which will be added to Launchpad if they're not resolved here):

    1. When I open any of the websites in Banshee (Amazon or the Miro Guide), as soon as the site finishes loading it crashes Banshee.
    2. If I play any video, local or remote, it will show 1 frame (maybe 0.5 s of video), then I get a black screen and the audio continues in the background.

    Specs & details: I have a Fujitsu Amilo 1718 laptop with 2 GB of RAM (originally 1 GB); graphics is provided by an ATI Radeon Xpress 200M (don't laugh, it works with Compiz... just). I have a link to the output of banshee --debug here. Don't have time to read? Here are the highlights:

        [2 Warn 11:52:34.814] Caught an exception - System.ArgumentNullException: Argument cannot be null.

    then a bit later:

        Debug info from gdb:
        Could not attach to process. If your uid matches the uid of the target process, check the setting of /proc/sys/kernel/yama/ptrace_scope, or try again as the root user. For more details, see /etc/sysctl.d/10-ptrace.conf
        ptrace: Operation not permitted.
        =================================================================
        Got a SIGSEGV while executing native code. This usually indicates
        a fatal error in the mono runtime or one of the native libraries
        used by your application.
        =================================================================
        Aborted

    Not music to my ears, as you can expect. The version I am using is 1.9.4 from the daily PPA, but these bugs happen in any version of Banshee from 1.8.1 and up. So if anyone has come across a fix for this problem, please share! Additional info: both VLC and Miro work on my system, so there isn't a system-wide problem with video, and I haven't mentioned Mono, so no trolling; it will get voted down.

    Read the article

  • Creating an update method in a different class

    - by Sweta Dwivedi
    I have created a class called 3D model, which animates my 3D model by changing the model position according to values read from a .txt file into a list... Since I'm using a loop to read the point values, when it reaches the end of the file XNA throws an out-of-bounds exception (which is obvious), but if I add the same code to my Game.cs Update(gameTime) method, then I don't have this problem. Any idea how to make my 3D model's update work the same as the Update in Game.cs? Here is the code, to give some idea:

        public void patterns(GameTime gameTime)
        {
            motion_z = new List<Point3D>();

            if (pattern == 1)
            {
                f = "E:/Motion_Track-output/Output1.txt";
            }
            if (pattern == 2)
            {
                f = "E:/Motion_Track-output/cruse.txt";
            }

            // TODO: Add your update logic here
            using (StreamReader r = new StreamReader(f))
            {
                string line;
                //Viewport view = graphics.GraphicsDevice.Viewport;
                int maxWidth = view.Width;
                int maxHeight = view.Height;
                while ((line = r.ReadLine()) != null)
                {
                    string[] temp = line.Split(',');
                    int x = (int)Math.Floor(((float.Parse(temp[0]) * 0.5f) + 0.5f) * maxWidth);
                    int y = (int)Math.Floor(((float.Parse(temp[1]) * -0.5f) + 0.5f) * maxHeight);
                    int z = (int)Math.Floor(((float.Parse(temp[2]) / 4 * 20000)));
                    motion_z.Add(new Point3D(x, y, z));
                }
                modelPosition.X = (float)(motion_z[i].X);
                modelPosition.Y = (float)(motion_z[i].Y);
                modelPosition.Z = (float)(motion_z[i].Z);
                i++;
            }
            //Console.WriteLine("modelposX:" + modelPosition.X + "," + "motionzX:" + motion_z[i].X);
        }
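
    One way to read the snippet above: the file is re-read and i incremented on every call, so i eventually indexes past the end of motion_z. A hedged sketch that loads the points once and clamps the index on each Update (it reuses the snippet's fields motion_z, i, and modelPosition, and assumes maxWidth/maxHeight come from the viewport as above):

        // Load once (e.g., from LoadContent), not on every Update.
        private void LoadPattern(string path)
        {
            motion_z = new List<Point3D>();
            foreach (string line in File.ReadLines(path))
            {
                string[] temp = line.Split(',');
                int x = (int)Math.Floor(((float.Parse(temp[0]) * 0.5f) + 0.5f) * maxWidth);
                int y = (int)Math.Floor(((float.Parse(temp[1]) * -0.5f) + 0.5f) * maxHeight);
                int z = (int)Math.Floor(float.Parse(temp[2]) / 4 * 20000);
                motion_z.Add(new Point3D(x, y, z));
            }
            i = 0;
        }

        // Called from Update(gameTime): one point per frame, clamped so the
        // index can never run past the list.
        private void StepAnimation()
        {
            if (motion_z == null || motion_z.Count == 0)
                return;

            modelPosition = new Vector3(
                (float)motion_z[i].X,
                (float)motion_z[i].Y,
                (float)motion_z[i].Z);

            if (i < motion_z.Count - 1)
                i++;  // or: i = (i + 1) % motion_z.Count; to loop the pattern
        }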

    Read the article

  • Error loading libGL.so.1

    - by jdp407
    When attempting to run various pieces of software (notably Steam and Yenka), I have come across an error similar to this:

        error while loading shared libraries: libGL.so.1: cannot open shared object file: No such file or directory

    I'm running a 64-bit system with an NVidia Optimus card (I dual boot for certain Windows-only software that requires a dedicated graphics card). I have Bumblebee installed, and I am using the nvidia-current driver rather than one downloaded from NVidia, as recommended. The library (libGL.so.1) is not present in the top directory of /usr/lib; however, it is present in /usr/lib32/nvidia-current, as a soft link to /usr/lib32/nvidia-current/libGL.so.304.64. A section of the output from ldconfig -p:

        libGL.so.1 (libc6,x86-64, OS ABI: Linux 2.4.20) => /usr/lib/x86_64-linux-gnu/mesa/libGL.so.1
        libGL.so (libc6,x86-64, OS ABI: Linux 2.4.20) => /usr/lib/x86_64-linux-gnu/libGL.so
        libGL.so (libc6,x86-64, OS ABI: Linux 2.4.20) => /usr/lib/x86_64-linux-gnu/mesa/libGL.so

    Obviously libraries with that name exist, located in /usr/lib/x86_64-linux-gnu, but installed software doesn't seem to be able to 'see' them. For Steam, running it with optirun causes it to work, but this is not the case for Yenka. I assume that optirun causes the library stored in /usr/lib32/nvidia-current to be used, which allows Steam to run, so I can't understand why Yenka won't run. Can anyone explain why software can't see the normal mesa library, and why Yenka refuses to run with the nvidia-current library?

    Read the article

  • XNA 2D Spritesheet drawing rendering problem

    - by user24092
    I'm making a tile-based game, using one spritesheet containing all tile graphics. Each tile has a size of 32x32 pixels. The main problem: when I draw a tile to the screen, if the tile position x and y are not rounded, or if scale is activated in the spriteBatch.Draw() method (scale != 1.0f), I get some lines of adjacent tiles on the spritesheet drawn into the current tile. I already tried setting SamplerState to PointClamp and removing anti-aliasing, but it still doesn't work. Here I'll show images of some tests that I made with a test spritesheet I created (a 9x9 spritesheet, with each sprite of size 32x32 containing a unique solid color):

        Tests: http://img6.imageshack.us/img6/5946/testsqj.png
        Spritesheet used: http://imageshack.us/a/img821/1341/tilesm.png

    Even with anti-aliasing removed and PointClamp set as the sampler state, I am still getting this issue; XNA keeps drawing part of the adjacent pixels of the texture on the screen. What I want is to get the correct area of the tilesheet texture (as seen in the first test, which gets just the yellow pixels). My question is: is there any way I can fix this WITHOUT adding tile spacing or any other modification involving the tilesheet? Maybe by disabling some texture filtering that is done by XNA, or something like that.
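
    One commonly suggested workaround that leaves the tilesheet itself untouched: draw the map unscaled at integer coordinates into a RenderTarget2D, then draw the finished target scaled as a single texture, so the sampler never reads across tile boundaries. A minimal sketch, assuming an XNA 4.0 Game with an 800x480 back buffer (the sizes and scale factor are illustrative):

        // In LoadContent:
        RenderTarget2D sceneTarget = new RenderTarget2D(GraphicsDevice, 800, 480);
        float scale = 2f;  // illustrative zoom factor

        // In Draw:
        GraphicsDevice.SetRenderTarget(sceneTarget);
        GraphicsDevice.Clear(Color.Black);
        spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
                          SamplerState.PointClamp, DepthStencilState.None,
                          RasterizerState.CullCounterClockwise);
        // ... draw every tile here, unscaled and at integer positions ...
        spriteBatch.End();

        GraphicsDevice.SetRenderTarget(null);
        spriteBatch.Begin();
        // Scaling the whole target cannot bleed between tiles, because the
        // tiles were already composited into one texture.
        spriteBatch.Draw(sceneTarget, Vector2.Zero, null, Color.White,
                         0f, Vector2.Zero, scale, SpriteEffects.None, 0f);
        spriteBatch.End();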

    Read the article

  • Display Problems running Ubuntu 12.04 in Windows VMware Player

    - by Alex Reynolds
    I am posting this again because I have changed to VMware Player and I discovered something new. I am running Ubuntu 12.04 LTS (guest) in VMware Player 6.0.2 on a Windows 7 64-bit host. I have VMware Tools installed properly. I am running MATE, but the problem persists when I change (to and from) Xfce, GNOME, and MATE, and back again. I had been using this setup for a couple of years without any graphics issues. After an Ubuntu update (typical), my video is corrupt and will not refresh correctly. My desktop icons are "mirrored", and when I open a terminal window (for instance), sometimes the window appears multiple times; of course, only one is the real image. It seems to be a refresh problem. When I move an active window around the screen, the "older" images "erase" and my icons (for instance) are in the correct location. For instance, I can move the terminal around a pretty corrupted window and the screen behind it is repainted correctly.

    NEW: Next, I got the idea to try Remote Desktop into the Ubuntu 12.04 LTS guest OS from my Windows 7 64-bit host. Once working, and connected via the Windows RDC client, using MATE as default in my /etc/xrdp/startwm.sh, my video is not corrupted and works fine (as in the past). Any ideas about what is going on, or how to fix this?

    Read the article
