Search Results

Search found 3136 results on 126 pages for 'buffer overrun'.

Page 72/126 | < Previous Page | 68 69 70 71 72 73 74 75 76 77 78 79  | Next Page >

  • Multiple vulnerabilities in Mozilla Firefox

    - by chandan
    CVE | Description | CVSSv2 Base Score
    CVE-2011-2372 | Permissions, Privileges, and Access Controls vulnerability | 3.5
    CVE-2011-2995 | Denial of Service (DoS) vulnerability | 10.0
    CVE-2011-2997 | Denial of Service (DoS) vulnerability | 10.0
    CVE-2011-3000 | Improper Control of Generation of Code ('Code Injection') vulnerability | 4.3
    CVE-2011-3001 | Permissions, Privileges, and Access Controls vulnerability | 4.3
    CVE-2011-3002 | Denial of Service (DoS) vulnerability | 9.3
    CVE-2011-3003 | Denial of Service (DoS) vulnerability | 10.0
    CVE-2011-3004 | Improper Input Validation vulnerability | 4.3
    CVE-2011-3005 | Denial of Service (DoS) vulnerability | 9.3
    CVE-2011-3232 | Improper Control of Generation of Code ('Code Injection') vulnerability | 9.3
    CVE-2011-3648 | Improper Neutralization of Input During Web Page Generation ('Cross-site Scripting') vulnerability | 4.3
    CVE-2011-3650 | Improper Restriction of Operations within the Bounds of a Memory Buffer vulnerability | 9.3
    CVE-2011-3651 | Denial of Service (DoS) vulnerability | 10.0
    CVE-2011-3652 | Denial of Service (DoS) vulnerability | 10.0
    CVE-2011-3654 | Denial of Service (DoS) vulnerability | 10.0
    CVE-2011-3655 | Improper Control of Generation of Code ('Code Injection') vulnerability | 9.3
    Component: Firefox web browser. Product and Resolution: Solaris 11 11/11 SRU 3; Solaris 10 Contact Support.
    This notification describes vulnerabilities fixed in third-party components that are included in Sun's product distribution. Information about vulnerabilities affecting Oracle Sun products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • lubuntu notify-send remove limit of 21?

    - by giuspen
    When sending notifications with notify-send in Lubuntu, e.g. notify-send -i error -t 1000 "Error" "error notification", I can send only 21 of them; after that no more notifications are shown. The only way to receive more notifications is to click on the panel, where there is a letter icon with the number 21, and then click the "clear all notifications" button. Is there a way to avoid having to click that button, and is there also a way to remove that letter icon with the notification count altogether? UPDATE: I realize that notification-daemon (0.7.3) is being used. I downloaded the sources and edited the source code (nd-queue.c, on_bubble_destroyed) so that it does not buffer the bubbles but always destroys them, but I would prefer another way...

    Read the article

  • Omni-directional shadow mapping

    - by gridzbi
    What is a good/the best way to fill a cube map with depth values that will give me the least amount of trouble with floating point imprecision? To get up and running I'm just writing the raw depth to the buffer; as you can imagine it's pretty terrible, and I need to improve it, but I'm not sure how. A few tutorials on directional lights divide the depth by W and store the Z/W value in the cube map - how would I perform the depth comparison in my shadow mapping step? The nvidia article here http://http.developer.nvidia.com/GPUGems/gpugems_ch12.html appears to do something completely different and uses the dot product of the light vector, presumably to counter the depth precision worsening over distance. He also scales the geometry so that it fits into the range -0.5 to +0.5. The article looks a bit dated, though - is this technique still reasonable? Shader code http://pastebin.com/kNBzX4xU Screenshot http://imgur.com/54wFI

    Read the article

  • Why does the MaxReceiveMessageSize in WCF matter in case of Streaming

    The default value of MaxReceiveMessageSize in WCF is 65,536. When you choose streaming as the TransferMode, the WCF runtime creates an 8192-byte buffer. What happens is that the WCF channel stack reads the first 8192 bytes and decodes the first couple of bytes as the size of the entire envelope. It then does a size check and sends back a fault if the actual size exceeds the limit. According to the MSDN documentation, the MaxReceiveMessageSize is something that prevents a DOS attack,...
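
    To make the relationship concrete, here is a minimal sketch of a streamed binding in code; the size values are illustrative only, and note that the actual property on the binding is spelled MaxReceivedMessageSize:

      using System;
      using System.ServiceModel;

      class StreamingBindingSketch
      {
          static void Main()
          {
              // Streamed transfer: the body is read as a stream instead of being
              // buffered whole, but the envelope size is still checked up front.
              var binding = new BasicHttpBinding
              {
                  TransferMode = TransferMode.Streamed,

                  // Upper bound on the total message size enforced by the channel stack.
                  MaxReceivedMessageSize = 1L * 1024 * 1024 * 1024, // 1 GB, illustrative

                  // In-memory buffer used for the headers / initial read
                  // (the 8192-byte figure above is the runtime's internal read size).
                  MaxBufferSize = 65536
              };

              Console.WriteLine("Streamed, limit = " + binding.MaxReceivedMessageSize);
          }
      }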

    Read the article

  • Unable to install on a Samsung 305v5a

    - by Antony
    I have used Ubuntu for years. I bought a Samsung 305v5a-so2 laptop yesterday; it runs an AMD A8 quad-core. I have a CD of 10.04, and as I am not clear about whether to install 32-bit or 64-bit, I thought I would run the Ubuntu live trial from the CD to see it. After about 30 minutes I started getting Authentication Failure messages, then a squashfs error (unable to read fragment cache entry), then a zillion buffer/logical-block error messages, 17,000+ of them. Should I download 11.10 in 32-bit, or go and try the 64-bit? I really don't want to mess up the new laptop already, but I am not going to want to work with Windows 7 either. Thanks for any help.

    Read the article

  • Error in mounting HDD

    - by Vikramjeet
    I get the following error whenever I mount my external HDD. It was working before; then I opted to safely remove the drive, and now mounting fails with:
    Error mounting: mount exited with exit code 13: ntfs_mst_post_read_fixup_warn: magic: 0x43425355 size: 4096 usa_ofs: 8850 usa_count: 65535: Invalid argument
    Actual VCN (0x800006009000000) of index buffer is different from expected VCN (0x0).
    Failed to mount '/dev/sdb1': Input/output error
    NTFS is either inconsistent, or there is a hardware fault, or it's a SoftRAID/FakeRAID hardware. In the first case run chkdsk /f on Windows then reboot into Windows twice. The usage of the /f parameter is very important! If the device is a SoftRAID/FakeRAID then first activate it and mount a different device under the /dev/mapper/ directory, (e.g. /dev/mapper/nvidia_eahaabcc1). Please see the 'dmraid' documentation for more details.

    Read the article

  • XAudio2 - Multiple instances of the same sound

    - by Boreal
    Right now, I'm adding a rudimentary sound engine to my game. So far, I am able to load in a WAV file and play it once, then free up the memory when I close the game. However, the game crashes with a nice ArgumentOutOfRangeException when I try to play another sound instance: Specified argument was out of the range of valid values. Parameter name: readLength. I'm following this tutorial pretty much exactly, but I still keep getting the aforementioned error. Here's my sound-related code: http://pastebin.com/FgaqfXTs The exception occurs on line 156, when I am playing the sound: source.SubmitSourceBuffer(buffer);
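
    Without the full listing this is only a guess, but a readLength error like that often comes from resubmitting an AudioBuffer whose underlying DataStream has already been consumed or disposed. A hedged SharpDX-style sketch (class and helper names here are made up, not taken from the question) that keeps the decoded bytes around and builds a fresh voice and buffer per playback:

      using SharpDX;
      using SharpDX.Multimedia;
      using SharpDX.XAudio2;

      class SoundEffect
      {
          private readonly XAudio2 device;
          private readonly WaveFormat format;
          private readonly byte[] data;          // decoded samples kept for the effect's lifetime
          private readonly uint[] packetsInfo;

          public SoundEffect(XAudio2 device, string path)
          {
              this.device = device;
              using (var stream = new SoundStream(System.IO.File.OpenRead(path)))
              {
                  format = stream.Format;
                  packetsInfo = stream.DecodedPacketsInfo;
                  data = new byte[stream.Length];
                  stream.Read(data, 0, data.Length);
              }
          }

          // Each call gets its own SourceVoice and its own AudioBuffer/DataStream,
          // so several instances of the same sound can play at once.
          public SourceVoice Play()
          {
              var voice = new SourceVoice(device, format);
              var buffer = new AudioBuffer
              {
                  Stream = DataStream.Create(data, true, false),
                  AudioBytes = data.Length,
                  Flags = BufferFlags.EndOfStream
              };
              voice.SubmitSourceBuffer(buffer, packetsInfo);
              voice.Start();
              return voice; // caller should DestroyVoice()/Dispose() when the sound finishes
          }
      }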

    Read the article

  • After upgrading to 13.10, I can't input Japanese and Chinese in Emacs

    - by oda
    I have just upgraded Ubuntu from 13.04 to 13.10. It seems big changes have been made to iBus. I went to System Settings - Text Entry settings and added the "Chinese pinyin" and "Japanese anthy" input methods. They work well when I input Chinese or Japanese in a terminal or a .txt file. But when I want to input Chinese or Japanese in Emacs, even though I have enabled ibus-mode in the buffer and switched to the Chinese pinyin or Japanese anthy input method, it just outputs the English characters. Below is the ibus configuration in my .emacs. By the way, it worked well before I upgraded Ubuntu to 13.10 and Emacs to 24.3.1.
    (add-to-list 'load-path (concat my-emacs-path "/ibus-el-0.3.2"))
    ;;(setq ibus-python-shell-command-name "python2.7")
    (require 'ibus)
    ;; Turn on ibus-mode automatically after loading .emacs
    (add-hook 'after-init-hook 'ibus-mode-on)
    (setq ibus-cursor-color '("red" "blue" "limegreen"))

    Read the article

  • Why can't I use Unity 3D on an ATI Mobility Radeon HD 4200 Series with FGLRX drivers?

    - by user88257
    With a brand new install of 12.04 on my Dell Inspiron M5030, Unity 3D appears to load everything but the icons in the top left, and I am unable to click on anything. Unity 2D seems to work fine, however. I have done nothing except install the Synaptic Package Manager and follow this guide to install the FGLRX drivers; under Settings > Additional Drivers, the driver shows as installed and functioning. Also, after running /usr/lib/nux/unity_support_test -p, I get this:
    OpenGL vendor string: ATI Technologies Inc.
    OpenGL renderer string: ATI Mobility Radeon HD 4200 Series
    OpenGL version string: 3.3.11653 Compatibility Profile Context
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: yes
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: yes
    I am unsure as to what the problem is. I have tried the non-proprietary drivers; I could not get them to work either, but I would be willing to try again.

    Read the article

  • Resources for 2D rendering using OpenGL?

    - by nightcracker
    I noticed that there is quite some difference between 3D and 2D rendering using OpenGL; the techniques are different - pixel-perfect placement is a lot more desirable, among other things. Are there any good (complete) references on using OpenGL for rendering 2D graphics? There are quite a few "tutorials" around on the net that help you open a window, set up a half-decent environment and draw a sprite, but no real good information on rotation, blending, lighting, drawing order, using the z-buffer, particles, "complex" primitives (circles, stars, cross symbols), ensuring pixel-perfect rendering, instancing and many other staple 2D effects/techniques. Any books, great blogs, anything? Any particular awesome libraries to read?

    Read the article

  • Is there a global "low resolution" filter for OpenGL?

    - by Ian Henry
    I'm trying to learn a little about OpenGL, so I'm making a simple 2D game (with OpenTK), and so far it's coming along well. I thought it would be fun to give it that, for lack of a better word, retropixelated look of games from the early nineties. I figured it would be an easy thing to do -- simply draw everything at half its normal size and scale up with no anti-aliasing. But I can't find any resources on how to do this. I can set the min/mag filters of my textures to nearest and that works fine for my sprites, but I'm using lots of primitives and I'd like the effect to apply to them as well. The one idea I had was to draw everything at half size, then somehow copy the render buffer to a texture, then render that texture full-size, but I don't know how to do that, and it seems like there must be a better way. Can anyone help me out?
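
    One way to get that effect, under the assumption that rendering everything to an off-screen target is acceptable, is to draw the whole frame into a half-resolution FBO texture and then blit it to the window with nearest-neighbour filtering, so primitives get pixelated too. A rough OpenTK sketch (names are mine, not from the question):

      using OpenTK.Graphics.OpenGL;

      static class LowResTarget
      {
          static int fbo, colorTex;

          // Create the half-resolution colour target once (startup or on resize).
          public static void Create(int lowWidth, int lowHeight)
          {
              colorTex = GL.GenTexture();
              GL.BindTexture(TextureTarget.Texture2D, colorTex);
              GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba8,
                            lowWidth, lowHeight, 0, PixelFormat.Rgba, PixelType.UnsignedByte,
                            System.IntPtr.Zero);
              GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter,
                              (int)TextureMinFilter.Nearest);
              GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter,
                              (int)TextureMagFilter.Nearest);

              GL.GenFramebuffers(1, out fbo);
              GL.BindFramebuffer(FramebufferTarget.Framebuffer, fbo);
              GL.FramebufferTexture2D(FramebufferTarget.Framebuffer, FramebufferAttachment.ColorAttachment0,
                                      TextureTarget.Texture2D, colorTex, 0);
              GL.BindFramebuffer(FramebufferTarget.Framebuffer, 0);
          }

          // Per frame: render the whole scene into the small target...
          public static void BeginFrame(int lowWidth, int lowHeight)
          {
              GL.BindFramebuffer(FramebufferTarget.Framebuffer, fbo);
              GL.Viewport(0, 0, lowWidth, lowHeight);
          }

          // ...then stretch it onto the window with GL_NEAREST so the pixels stay crisp.
          public static void EndFrame(int lowWidth, int lowHeight, int windowWidth, int windowHeight)
          {
              GL.BindFramebuffer(FramebufferTarget.ReadFramebuffer, fbo);
              GL.BindFramebuffer(FramebufferTarget.DrawFramebuffer, 0);
              GL.BlitFramebuffer(0, 0, lowWidth, lowHeight, 0, 0, windowWidth, windowHeight,
                                 ClearBufferMask.ColorBufferBit, BlitFramebufferFilter.Nearest);
              GL.BindFramebuffer(FramebufferTarget.Framebuffer, 0);
              GL.Viewport(0, 0, windowWidth, windowHeight);
          }
      }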

    Read the article

  • Gnome 3 has graphics problems after it runs

    - by Antonis
    I have a 3D accelerator, but GNOME still doesn't work. My PC enters the GNOME desktop, but there I have graphics problems: problems with the top taskbar, and wherever I click on it, my desktop transforms into the GNOME Classic desktop. I am using an ATI Radeon HD 4800 and Ubuntu 11.10.
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: yes
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: yes
    Similar report from @user26930 - ATI HD5770 graphics card - gnome-shell is affected but Unity works fine.

    Read the article

  • Audio Panning using RtAudio

    - by user1801724
    I use the RtAudio library. I would like to implement an audio program where I can control the panning (e.g. shifting the sound from the left channel to the right channel). In my specific case, I use RtAudio in duplex mode (you can find an example here: duplex mode). It means that I link the microphone input to the speaker output. I have searched on the web, but I did not find anything useful. Should I apply a filter on the output buffer? What kind of filter?
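
    In a duplex callback, panning usually comes down to scaling the two output channels rather than any heavy filtering. A common choice is constant-power panning; this is a generic sketch (not RtAudio-specific, and it assumes the output block is interleaved stereo floats):

      using System;

      static class Panner
      {
          // pan: -1 = hard left, 0 = centre, +1 = hard right.
          // Constant-power gains keep perceived loudness roughly stable across the sweep.
          public static void ApplyPan(float[] interleavedStereo, float pan)
          {
              double angle = (pan + 1.0) * Math.PI / 4.0;   // 0 .. pi/2
              float leftGain = (float)Math.Cos(angle);
              float rightGain = (float)Math.Sin(angle);

              for (int i = 0; i < interleavedStereo.Length; i += 2)
              {
                  interleavedStereo[i]     *= leftGain;   // left sample
                  interleavedStereo[i + 1] *= rightGain;  // right sample
              }
          }
      }

    In the duplex callback you would copy the input block into the output block and run something like this over it before returning.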

    Read the article

  • Audio Panning using RtAudio

    - by user1801724
    I use the RtAudio library. I would like to implement an audio program where I can control the panning (e.g. shifting the sound from the left channel to the right channel). In my specific case, I use duplex mode (you can find an example here: duplex mode). It means that I link the microphone input to the speaker output. I searched on the web, but I did not find anything useful. Should I apply a filter on the output buffer? What kind of filter? Can anyone help me? Thanks

    Read the article

  • How do I determine the draw order in an isometric view flash game?

    - by Gajet
    This is for a flash game with an isometric view. I need to know how to sort objects so that there is no need for z-buffer checking when drawing. This might seem easy, but there is another restriction: a scene can have 10,000+ objects, so the algorithm needs to run in less than O(n^2) time. All objects are rectangular boxes, and there are 3-4 objects moving in the scene. What's the best way to do this? UPDATE: in each tile there is only one object (I mean objects can stack on top of each other), and we have access to both the map of objects and the objects' own positions.
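
    If every object really does sit on a single tile, as the update suggests, one common approach is a painter's sort on a per-tile depth key, which is O(n log n) overall. A sketch under that assumption (field names are invented for illustration):

      using System.Collections.Generic;

      class IsoObject
      {
          public int X, Y, Z;   // tile coordinates; Z = stacking height on the tile
      }

      static class IsoSort
      {
          // Back-to-front order for a standard isometric camera: tiles with a
          // smaller x + y are farther away and get drawn first, and objects
          // stacked on the same tile are drawn bottom-up.
          public static void SortBackToFront(List<IsoObject> objects)
          {
              objects.Sort((a, b) =>
              {
                  int da = a.X + a.Y;
                  int db = b.X + b.Y;
                  if (da != db) return da.CompareTo(db);
                  return a.Z.CompareTo(b.Z);
              });
          }
      }

    Since only 3-4 objects move per frame, the list stays almost sorted, so removing and re-inserting just the movers keeps the per-frame cost well below a full sort.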

    Read the article

  • What is the benefit of triple buffering?

    - by user782220
    I read everything written in a previous question. From what I understand, in double buffering the program must wait until the finished drawing is copied or swapped before starting the next drawing. In triple buffering the program has two back buffers and can immediately start drawing into the one that is not involved in such copying. But with triple buffering, if you're in a situation where you can take advantage of the third buffer, doesn't that suggest that you are drawing frames faster than the monitor can refresh? So then you don't actually get a higher frame rate. So what is the benefit of triple buffering then?

    Read the article

  • Why do webpages take longer to load in Ubuntu 12.04 than in Windows 7

    - by Emil Abraham
    For example, when I click on a Facebook picture, the picture remains pixelated for about 30 seconds, then starts to clear up. Or when I watch YouTube videos, I can't watch them in HD without running into buffering issues. Windows 7 is just much snappier. It might be an issue with the graphics card. Dual-booting Windows 7 64-bit and Ubuntu 12.04 64-bit. Specs: CPU: Intel® Core™ i7-2630QM CPU @ 2.00GHz × 8; RAM: 8GB DDR3; HDD: 50 GB for Ubuntu and the remaining 1.5 TB for Windows. The interesting part: the graphics card according to System Settings in Ubuntu is Intel® Sandybridge Mobile, while it should be a Radeon™ HD 7690M XT switchable graphics with 1024MB GDDR5 and up to 5093MB total graphics memory.

    Read the article

  • Game Code Design for Rendering

    - by kuroutadori
    I first created a game on the iPhone and I'm now porting it to Android. I wrote most of the code in C++, but when it came to porting, it wasn't so easy. The Android way is to have two threads, one for rendering and one for updating, because some devices block when updating the hardware. My problem is that I am coming from the iPhone, where, when I transition, say, from the Menu to the Game, I would stop the Animation (rendering) and load up the next Manager (the Menu has a Manager and so has the Game). I could implement the same thing on Android, but I have noticed that game ports like Quake don't do this, as far as I can tell. I have learnt that I cannot just dynamically add another Renderer class to the tree, because I will probably get a dequeuing buffer error, which I believe to be a problem on the OpenGL ES side. So how is it done?

    Read the article

  • Workaround the flip queue (AKA pre-rendered frames) in OpenGL?

    - by user41500
    It appears that some drivers implement a "flip queue" such that, even with vsync enabled, the first few calls to swap buffers return immediately (queuing those frames for later use). It is only after this queue is filled that buffer swaps will block to synchronize with vblank. This behavior is detrimental to my application. It creates latency. Does anyone know of a way to disable it or a workaround for dealing with it? The OpenGL Wiki on Swap Interval suggests a call to glFinish after the swap but I've had no such luck with that trick.
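
    A workaround I have seen used (no guarantee it helps on every driver) is to bound the queue yourself with a sync fence: insert a fence right after the swap and, at the start of the next frame, wait on the previous frame's fence so the CPU never runs more than one frame ahead. A sketch in OpenTK terms, with the timeout chosen arbitrarily:

      using System;
      using OpenTK.Graphics.OpenGL;

      static class FrameLimiter
      {
          static IntPtr previousFrameFence = IntPtr.Zero;

          // Call at the top of each frame, before issuing any GL commands.
          public static void BeginFrame()
          {
              if (previousFrameFence != IntPtr.Zero)
              {
                  // Block until the previous frame's commands have completed on the GPU,
                  // which stops the driver from stacking up several pre-rendered frames.
                  GL.ClientWaitSync(previousFrameFence, ClientWaitSyncFlags.SyncFlushCommandsBit, 1000000000);
                  GL.DeleteSync(previousFrameFence);
                  previousFrameFence = IntPtr.Zero;
              }
          }

          // Call instead of a bare SwapBuffers at the end of each frame.
          public static void EndFrame(OpenTK.GameWindow window)
          {
              window.SwapBuffers();
              // Signalled once all GL commands issued before this point have completed.
              previousFrameFence = GL.FenceSync(SyncCondition.SyncGpuCommandsComplete, WaitSyncFlags.None);
          }
      }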

    Read the article

  • Render full-screen gradient or texture

    - by Filip Skakun
    What's the simplest way to fill the background of the screen with a gradient or a texture in Direct3D 10/11? I'm building a Windows 8 Metro app in which the camera never moves and I render some content in D3D, but I need to fill the background with something other than a solid color. Do I need to figure out the size and position of a rectangle and position it in 3D space, or is there a simpler solution? I don't care about depth at all, and I don't use a depth buffer since all my content is sorted back to front, so I could just start by drawing the background.

    Read the article

  • How are a Java ByteBuffer's limit and position variables updated?

    - by Dummy Derp
    There are two scenarios: writing and reading.
    Writing: Whenever I write something to the ByteBuffer by calling its put(byte[]) method, the position variable is incremented by the size of the byte[], and the limit stays at the maximum. If, however, I put the data in a view buffer, then I have to calculate and update the position manually. Before I call the channel's write(ByteBuffer) method to write something, I have to flip() the ByteBuffer so that position points to zero and limit points to the last byte that was written to the ByteBuffer.
    Reading: Whenever I call a channel's read(ByteBuffer) method to read something, the position variable stays at 0 and the limit variable of the ByteBuffer points to the last byte that was read. So, if the ByteBuffer is smaller than the file being read, the limit variable is pushed to the max. This means that the ByteBuffer is already flipped and I can proceed to extracting the values from it. Please correct me where I am wrong :)

    Read the article

  • For 2D games, is there any reason NOT to use a 3D API like Direct3D or OpenGL?

    - by Eric Palakovich Carr
    I've been out of hobby game development for quite a while now. Back when I did it, most people used DirectDraw to create 2D games. By the time I stopped, people were saying OpenGL or Direct3D with an orthographic projection is just the way to go. I'm thinking about getting back into creating 2D games, in particular on mobile phones, but maybe on the XNA platform as well. To make something using OpenGL I'd have a (hopefully) small learning curve to acclimate myself to 3D development. Is there any reason to skip that and instead work with a 2D framework where I just have a Width x Height frame buffer I need to fill with pixels?
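
    For the XNA side of the question, the "3D API for 2D" setup boils down to an orthographic projection that maps pixel coordinates straight to the screen, after which sprites are just textured quads. A minimal sketch assuming a BasicEffect (names and values are illustrative):

      using Microsoft.Xna.Framework;
      using Microsoft.Xna.Framework.Graphics;

      static class Ortho2D
      {
          // Configure a BasicEffect so vertex positions are given directly in pixels,
          // with (0, 0) at the top-left of the screen, as a 2D framework would expect.
          public static void Configure(BasicEffect effect, int screenWidth, int screenHeight)
          {
              effect.World = Matrix.Identity;
              effect.View = Matrix.Identity;
              effect.Projection = Matrix.CreateOrthographicOffCenter(
                  0, screenWidth,      // left, right
                  screenHeight, 0,     // bottom, top (flipped so y grows downward)
                  0, 1);               // near, far
              effect.TextureEnabled = true;
          }
      }

    XNA's SpriteBatch already does essentially this internally, which is one reason the "just use the 3D API" advice is less extra work than it sounds.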

    Read the article

  • Skip the first RenderTarget when writing to MRT with Opaque blending

    - by cubrman
    I am writing to three render targets and want to know how to tell a GPU not to write to the first RT. When you write a shader you can simply output less data than you have RTs (like outputting a single float4 when writing to three RTs) and only the first RTs will be affected, but you cannot specify that this data go anywhere else but to COLOR0, then 1, etc. Is there a way to write to several RTs but skip the first target? If I output zeroes, the data in the target will become zeroes, but I need it to remain untouched in the first target and only change in the specified ones. The reason I need this is to prevent data loss when calling SetRenderTarget() with DiscardContents RTs. I write to all the RTs at one point and I need to write to only the specified ones afterwards. It must be the first texture, as I have a depth buffer linked to it (XNA 4.0). Thanks.
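
    One possibility, if the goal is simply that draws leave RT0's pixels alone, is to mask colour writes to that target through the blend state; XNA 4.0 exposes a write mask per render target. A sketch along those lines (it does not address the separate issue of DiscardContents wiping the target when it is re-bound):

      using Microsoft.Xna.Framework.Graphics;

      static class MrtStates
      {
          // Opaque blending, but all colour writes to render target 0 are masked off,
          // while targets 1 and 2 are written normally.
          public static readonly BlendState SkipFirstTarget = new BlendState
          {
              ColorSourceBlend = Blend.One,
              ColorDestinationBlend = Blend.Zero,
              AlphaSourceBlend = Blend.One,
              AlphaDestinationBlend = Blend.Zero,

              ColorWriteChannels  = ColorWriteChannels.None,  // RT0: keep existing contents
              ColorWriteChannels1 = ColorWriteChannels.All,   // RT1: normal writes
              ColorWriteChannels2 = ColorWriteChannels.All    // RT2: normal writes
          };
      }

    It would be used by assigning GraphicsDevice.BlendState = MrtStates.SkipFirstTarget before the pass that should leave the first target untouched.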

    Read the article

  • why is emacs allowing multiple instances?

    - by Chad
    Around the time I did a fresh install of Ubuntu 12.04, I noticed that I can start multiple instances of Emacs. I find this annoying because I will think that a buffer should be open, but I'm in the wrong Emacs window. I may have changed something in .emacs, but I really don't think I did. I also reverted all of my customizations that are stored in ~/.emacs.d/custom.el. Emacs previously would give an error about another Emacs server being open when I attempted to start an additional instance of it, but it no longer does this. Any ideas on how to restore this behavior?

    Read the article

  • nVidia GeForce Go 7600: can it ever run Unity?

    - by Khaled Musleh
    My laptop, a Toshiba Qosmio G30, has an nVidia GeForce Go 7600 card, which is supposed to support 3D. I run Unity 2D now. I run 12.04, and the graphics driver reported by Ubuntu is VESA: G73 Board - toshg73m. When I run /usr/lib/nux/unity_support_test -p I get this list:
    Not software rendered: no
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: yes
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: no
    The card is not blacklisted, but a similar one with GT is! Do you think there is a chance the laptop can run Unity 3D? Maybe I could then change the screen resolution to a higher one too. I tried all the nvidia drivers provided, but none works (except 96 in Ubuntu 12.04); I get a black screen or a terminal screen. Best wishes to all.

    Read the article
