Search Results

Search found 17486 results on 700 pages for 'debug mode'.

Page 23/700 | < Previous Page | 19 20 21 22 23 24 25 26 27 28 29 30  | Next Page >

  • Can I customize the indentation of ternary operators in emacs' cperl-mode?

    - by Ryan Thompson
    In emacs cperl-mode, ternary operators are not treated specially. If you break them over multiple lines, cperl-mode simply indents each line the same way it indents any continued statement, like this:

        $result = ($foo == $bar) ? 'result1' :
            ($foo == $baz) ? 'result2' :
            ($foo == $qux) ? 'result3' :
            ($foo == $quux) ? 'result4' :
            fail_result;

    This is not very readable. Is there some way that I can convince cperl-mode to indent it like this instead, with the branches lined up?

        $result = ($foo == $bar)  ? 'result1'
                : ($foo == $baz)  ? 'result2'
                : ($foo == $qux)  ? 'result3'
                : ($foo == $quux) ? 'result4'
                :                   fail_result;

    By the way, the code example is from this question.

    Read the article

  • Unable to debug XBAP with Visual Studio 2010

    - by Oleg I.
    Just migrated my project to Visual Studio 2010, but the target framework was left at 3.5. The project contains an XBAP app running in partial trust and a bunch of WCF services. Debugging is configured to start PresentationHost.exe with the -debug and -debugSecurityZoneUrl parameters. Under VS2008 everything works fine, and also in VS2010 Beta2 (not sure about RC), but under VS2010 RTM debugging for some reason doesn't work. The application runs, but doesn't hit any breakpoint. And if, for example, an exception occurs, a message box appears ("Do you wish to debug or close..."), and after I choose the "debug" option a new weird message box appears:

        --------------------------- Warning ---------------------------
        A debugger is attached to PresentationHost.exe but not configured to debug this unhandled exception. To debug this exception, detach the current debugger.
        An unhandled exception was raised from Microsoft .NET Framework v 1.0, 1.1, or 2.0, but the current debugger is configured to debug Microsoft .NET Framework v4.0 code.
        Examine the exception using the SOS tool.
        --------------------------- OK ---------------------------

    And where is the vaunted multitargeting? Has anyone already bumped into the same issue? UPDATE: Tried to debug with the "Start browser with URL" option. Debugging works, but I get a SecurityException. So it is possible; I just need to figure out how to make it work with the "Start external program" option. UPDATE 2: Checked which PresentationHost.exe is actually loaded in both scenarios: "Start external program" - latest version (4.0.31106.0) from C:\Windows\System32\; "Start browser with URL" - old version (3.0.6920.4902) from C:\Windows\winsxs\x86_wpf-presentationhostexe_31bf3856ad364e35_6.1.7600.16385_none_6fca8974817173aa

    Read the article

  • Dynamic External Program in debug tab vs2008

    - by Justin Holbrook
    I am playing with NServiceBus using the generic host; specifically I'm working on having two different configurations, a debug configuration that logs to the console and a release configuration that logs to metabase (I'm using VS2008). I had just made some code changes (commented out a logging statement), but the statement was still showing in the log when I ran my solution. I eventually figured out why: I had switched the configuration to Release, made my change, then built. I think the change isn't being picked up because in the Debug tab of my project properties I have the following (abbreviated) path to the generic host: C:...\Inventory\bin\Debug\NServiceBus.Host.exe. Notice it specifically points to the Debug directory. So basically, even though I'm in the Release config, it's firing up the host from the Debug directory, which I think then uses the DLLs in the Debug directory (which is why my changes didn't get picked up). I tried to come up with a workaround, but have been unsuccessful. VS macros (like $(Configuration)) and relative pathing are not allowed here. http://connect.microsoft.com/VisualStudio/feedback/details/422223/relative-path-not-allowed-in-c-project-debug-properties-window Any ideas? I hope this doesn't require a custom build task.

    Read the article

  • Fedora 17 - Dropping into debug shell after attempted partitioning

    - by i.h4d35
    So I tried creating a new partition on Fedora 17 using fdisk as follows:

        Command (m for help): n
        Command action
           e   extended
           p   primary partition (1-4)
        p
        Partition number (1-4): 1
        First cylinder (2048-823215039, default 2048):
        Using default value 2048
        Last cylinder or +size or +sizeM or +sizeK (1-9039, default 9039): +15G

    Once this was done, instead of formatting the partition I created, I ran the partprobe command to write the changes to the partition table. On rebooting the computer, it drops to the debug shell and gives me the following error:

        dracut warning: unable to process initqueue
        dracut warning: /dev/disk/by-uuid/vg_mymachine does not exist
        dropping to debug shell
        dracut:/#

    While trying to run fsck on the said partition from the debug shell, it says "etc/fstab not found", and inside /etc I see a fstab.empty file. Is it now possible to retrieve what I have from the computer? Any help would be appreciated. Thanks in advance. Edit: I've also tried the following steps for additional troubleshooting: I tried to boot using the Fedora disk and tried the rescue mode - it says no Linux partition was detected. I tried to create an fstab file by combining the entries from blkid and the /etc/mtab file and using the UUIDs from the mtab file - it didn't work. As soon as I rebooted the machine, it promptly dropped me into the debug shell and the fstab file which I created wasn't there anymore in /etc (part of this solution).

    Read the article

  • Software: Launching League of Legends spectator mode from Command Line (Mac)

    - by Alex Popov
    Background: tl;dr at the end League of Legends has a spectator mode, in which you can watch someone else's game (essentially a replay) with a 3 minute delay. Popular LoL website OP.GG has figured out a clever way of hosting these spectator games on their own servers, thereby making them replayable, as opposed to only being available while the game is on (as Riot does it). If you request a replay from OP.GG, it sends a batch file which looks for where the League is situated and then the magic happens: @start "" "League of Legends.exe" "8394" "LoLLauncher.exe" "" "spectator fspectate.op.gg:4081 tjJbtRLQ/HMV7HuAxWV0XsXoRB4OmFBr 1391881421 NA1" This works fine on Windows. I'm trying to get it to work on Mac (which has an official client). First I tried running the same command by hand, (split for convenience) /Applications/ ... /LeagueOfLegends.app/ ... /LeagueofLegends 8393 LoLLauncher \ /Applications/ ... /LolClient spectator fspectate.op.gg:4081 tjJbtRLQ/HMV7HuAxWV0XsXoRB4OmFBr 1391881421 NA1 Running this, however, just starts the LoLLauncher, which closes all the active League processes. The exactly same thing happens if I just call /Applications/ ... /LeagueOfLegends.app/ ... /LeagueofLegends Next I tried seeing what actually happens when Spectator mode is initiated so I ran $ ps -axf | grep -i lol which showed UID PID PPID C STIME TTY TIME CMD 503 3085 1 0 Wed02pm ?? 0:00.00 (LolClient) 503 24607 1 0 9:19am ?? 0:00.98 /Applications/League of Legends.app/Contents/LOL/RADS/system/UserKernel.app/Contents/MacOS/UserKernel updateandrun lol_launcher LoLLauncher.app 503 24610 24607 0 9:19am ?? 1:08.76 /Applications/League of Legends.app/Contents/LoL/RADS/projects/lol_launcher/releases/0.0.0.122/deploy/LoLLauncher.app/Contents/MacOS/LoLLauncher 503 24611 24610 0 9:19am ?? 1:23.02 /Applications/League of Legends.app/Contents/LoL/RADS/projects/lol_air_client/releases/0.0.0.127/deploy/bin/LolClient -runtime .\ -nodebug META-INF\AIR\application.xml .\ -- 8393 503 24927 24610 0 9:44am ?? 0:03.37 /Applications/League of Legends.app/Contents/LoL/RADS/solutions/lol_game_client_sln/releases/0.0.0.117/deploy/LeagueOfLegends.app/Contents/MacOS/LeagueofLegends 8394 LoLLauncher /Applications/League of Legends.app/Contents/LoL/RADS/projects/lol_air_client/releases/0.0.0.127/deploy/bin/LolClient spectator 216.133.234.17:8088 Yn1oMX/n3LpXNebibzUa1i3Z+s2HV0ul 1400781241 NA1 Of Interest: there is (LolClient) which I cannot kill by it's PID. UserKernel updateandrun lol_launcher LoLLauncher.app is launched first. LoLLauncher is launched by the UserKernel (as we can see from the PPID) The very long command (PID: 24927) is how Spectator mode is launched, and is also launched by UserKernel. Spectator mode is launched in exactly the same way that the OP.GG .bat wanted to, with the only difference that Spectator mode connects to Riot instead of OP.GG's spectate server. I tried attaching GDB to the LolClient, but I couldn't get anything meaningful from it since it's an Adobe AIR application (and I've never used GDB with code other than mine own). Next I ran dtruss -a -b 100m -f -p $PID on everything I could think of: the LolClient, the LolLauncher and the UserKernel and skimmed the half a million lines produced. I found stuff like the GET request used to get the information of the game to spectate, but I could not see any launch of the equivalent of League of Legends.exe with spectator options. 
Finally, I ran lsof | grep -i lol to see if anything else was opened by the process, but didn't find anything that seemed appropriate. Open were UserKernel, LolLauncher, LolClient, Adobe AIR, LeagueofLegends and then Bugsplat, all of which are expected. None of this seemed especially relevant to figuring out how LeagueofLegends was opened into spectator mode. It obviously can be done, since Spectator mode is accessible from within the client. It seems likely that it can be done from the CLI, since Windows can do it and the clients are supposed to be equal. Unless I'm missing something in the difference between how UNIX and Windows handle CLI application launches. My question is whether there are any other things I can try to figure out how to launch Spectator mode myself. tl;dr: Trying to get into spectator mode from the CLI. It's possible on Windows (see first code block) but it just restarts League on Mac. What else can I try to find out what call is made, and how to reproduce it? PS: Please let me know how I can improve this question or its formatting, I'd love to use StackOverflow/SuperUser, but as the guys said on the podcast this week (Ep. 59) it's very intimidating. Sorry for posting this on StackOverflow the first time :(

    Read the article

  • It's like I'm in recovery mode after update, but I'm not

    - by mawburn
    I used the Ubuntu software updater and updated to the most recent packages. After the last update today, it's like I have gone into recovery mode, but I haven't. I am running UbuntuGNOME First, everything looks like this: Switching to dark mode does nothing. Also, default applications do not work. Such as Startup and the default screenshot application. Everything was working fine before the latest software update. System Info Ubuntu 14.04 LTS Gnome-Shell 3.10.4 Kernel 3.13.0-29 I can't figure out how to get an update history, but this is almost a fresh install. It's about a week old install and this is the 3rd time I've used the Ubuntu Software Update. I am running AMD ATI HD6700 with the proprietary Catalyst drivers. I tried to provide all information that I thought would be useful, if you need any more please let me know. Edit - I believe something went wrong within these updates: Update Log: Start-Date: 2014-06-09 19:07:07 Commandline: aptdaemon role='role-commit-packages' sender=':1.68' Install: libgnome-desktop-3-10:amd64 (3.12.0-0~eugenesan~trusty2) Upgrade: gnome-session-common:amd64 (3.9.90-0ubuntu12, 3.12.0-0~eugenesan~trusty10), gnome-session-bin:amd64 (3.9.90-0ubuntu12, 3.12.0-0~eugenesan~trusty10), gir1.2-gnomedesktop-3.0:amd64 (3.8.4-0ubuntu3, 3.12.0-0~eugenesan~trusty2), gnome-session:amd64 (3.9.90-0ubuntu12, 3.12.0-0~eugenesan~trusty10), python-libxml2:amd64 (2.9.1+dfsg1-3ubuntu4.1, 2.9.1+dfsg1-3ubuntu4.2), libspice-server1:amd64 (0.12.4-0nocelt2, 0.12.4-0nocelt2.02~eugenesan~trusty1), gir1.2-mutter-3.0:amd64 (3.10.4-0ubuntu2, 3.10.4-0ubuntu2.1), xserver-xorg-video-qxl:amd64 (0.1.1-0ubuntu3, 0.1.1-0ubuntu3.01), libxml2:amd64 (2.9.1+dfsg1-3ubuntu4.1, 2.9.1+dfsg1-3ubuntu4.2), libxml2:i386 (2.9.1+dfsg1-3ubuntu4.1, 2.9.1+dfsg1-3ubuntu4.2), gnome-desktop3-data:amd64 (3.8.4-0ubuntu3, 3.12.0-0~eugenesan~trusty2), mutter:amd64 (3.10.4-0ubuntu2, 3.10.4-0ubuntu2.1), mutter-common:amd64 (3.10.4-0ubuntu2, 3.10.4-0ubuntu2.1), libxml2-utils:amd64 (2.9.1+dfsg1-3ubuntu4.1, 2.9.1+dfsg1-3ubuntu4.2), libmutter0c:amd64 (3.10.4-0ubuntu2, 3.10.4-0ubuntu2.1) End-Date: 2014-06-09 19:07:12 I also installed Citrix Receiver today, following the tutorial here: Citrix Receiver 12.1 on Ubuntu 14.04 64-bit Log Start-Date: 2014-06-09 18:59:06 Commandline: apt-get install libmotif4:i386 nspluginwrapper lib32z1 libc6-i386 libxp6:i386 libxpm4:i386 libasound2:i386 Install: libmotif-common:amd64 (2.3.4-5, automatic), libatk1.0-0:i386 (2.10.0-2ubuntu2, automatic), libxft2:i386 (2.3.1-2, automatic), libgraphite2-3:i386 (1.2.4-1ubuntu1, automatic), nspluginviewer:i386 (1.4.4-0ubuntu5, automatic), libpango-1.0-0:i386 (1.36.3-1ubuntu1, automatic), libxcursor1:i386 (1.1.14-1, automatic), libmotif4:i386 (2.3.4-5), libxm4:amd64 (2.3.4-5, automatic), libxm4:i386 (2.3.4-5, automatic), libxp6:i386 (1.0.2-1ubuntu1), libpangocairo-1.0-0:i386 (1.36.3-1ubuntu1, automatic), libxcb-render0:i386 (1.10-2ubuntu1, automatic), libthai0:i386 (0.1.20-3, automatic), libharfbuzz0b:i386 (0.9.27-1, automatic), libpixman-1-0:i386 (0.30.2-2ubuntu1, automatic), libpangoft2-1.0-0:i386 (1.36.3-1ubuntu1, automatic), libcairo2:i386 (1.13.0~20140204-0ubuntu1, automatic), lib32z1:amd64 (1.2.8.dfsg-1ubuntu1), libjasper1:i386 (1.900.1-14ubuntu3, automatic), libgtk2.0-0:i386 (2.24.23-0ubuntu1.1, automatic), nspluginwrapper:amd64 (1.4.4-0ubuntu5), libuil4:amd64 (2.3.4-5, automatic), libuil4:i386 (2.3.4-5, automatic), libxcb-shm0:i386 (1.10-2ubuntu1, automatic), libxmu6:i386 (1.1.1-1, automatic), libc6-i386:amd64 (2.19-0ubuntu6), libxinerama1:i386 
(1.1.3-1, automatic), libgdk-pixbuf2.0-0:i386 (2.30.7-0ubuntu1, automatic), libxcomposite1:i386 (0.4.4-1, automatic), libmrm4:amd64 (2.3.4-5, automatic), libmrm4:i386 (2.3.4-5, automatic), libdatrie1:i386 (0.2.8-1, automatic), libxrandr2:i386 (1.4.2-1, automatic), libxpm4:i386 (3.5.10-1) End-Date: 2014-06-09 18:59:11

    Read the article

  • Network will not work properly after having run TCP Optimizer, but safe mode settings work perfectly. How to restore?

    - by michele
    I was experiencing some issues with my connection while playing online and I tried to optimize it by running TCP Optimizer on my PC (Windows 7 Professional 64-bit). I thought maybe the situation could improve, but it didn't. Actually, I now get extremely slow page loading times, probably due to a very low RWIN value of 1024. I understand that Windows 7 has a system to automatically adjust the RWIN value when needed. The setting from netsh is "normal", so I guess something else must be wrong. I tried every automatic tool out there to restore Windows' default values, but I had no success. I currently have what should be labeled as "default values" for everything TCP Optimizer initially changed, but the problem persists. The thing is, I just found out that running Windows in safe mode SOLVES the problem completely. The problem is that as soon as I reboot, I get the same issue all over again. So my question is: is there a way to use SAFE MODE network settings in NORMAL mode?

    Read the article

  • Getting logging.debug() to work on Google App Engine/Python

    - by brainjam
    I'm just getting started on building a Python app for Google App Engine. In the localhost environment I'm trying to send debug info to the GoogleAppEngineLauncher Log Console via logging.debug(), but it isn't showing up. However, anything sent through, say, logging.info() or logging.error() does show up. I've tried a logging.basicConfig(level=logging.DEBUG) before the logging.debug(), but to no avail. What am I missing?
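    A note on the usual cause (hedged; the exact dev-server behavior here is an assumption based on common setups): the development server typically attaches its own handler to the root logger with the level set to INFO, and logging.basicConfig() is a no-op once a handler already exists, so DEBUG records are filtered out before they reach the Log Console. A minimal Python sketch of the workaround:

        import logging

        # Raise the root logger's level directly instead of relying on basicConfig(),
        # which silently does nothing when the runtime has already configured a handler.
        logging.getLogger().setLevel(logging.DEBUG)

        logging.debug("this should now reach the Log Console")  # previously filtered out
        logging.info("info records were already visible")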

    Read the article

  • How to send a Java integer in four bytes to another application?

    - by user1468729
        public void routeMessage(byte[] data, int mode) {
            logger.debug(mode);
            logger.debug(Integer.toBinaryString(mode));
            byte[] message = new byte[8];
            ByteBuffer byteBuffer = ByteBuffer.allocate(4);
            ByteArrayOutputStream baoStream = new ByteArrayOutputStream();
            DataOutputStream doStream = new DataOutputStream(baoStream);
            try {
                doStream.writeInt(mode);
            } catch (IOException e) {
                logger.debug("Error converting mode from integer to bytes.", e);
                return;
            }
            byte[] bytes = baoStream.toByteArray();
            bytes[0] = (byte)((mode >>> 24) & 0x000000ff);
            bytes[1] = (byte)((mode >>> 16) & 0x000000ff);
            bytes[2] = (byte)((mode >>> 8) & 0x00000ff);
            bytes[3] = (byte)(mode & 0x000000ff);
            //bytes = byteBuffer.array();
            for (byte b : bytes) {
                logger.debug(b);
            }
            for (int i = 0; i < 4; i++) {
                //byte tmp = (byte)(mode >> (32 - ((i + 1) * 8)));
                message[i] = bytes[i];
                logger.debug("mode, " + i + ": " + Integer.toBinaryString(message[i]));
                message[i + 4] = data[i];
            }
            broker.routeMessage(message);
        }

    I've tried different ways (as you can see from the commented-out code) to convert the mode to four bytes to send it via a socket to another application. It works well with integers up to 127 and then again with integers over 256. I believe it has something to do with Java types being signed, but I don't seem to be able to get it to work. Here are some examples of what the program prints:

        127
        1111111
        0 0 0 127
        mode, 0: 0
        mode, 1: 0
        mode, 2: 0
        mode, 3: 1111111

        128
        10000000
        0 0 0 -128
        mode, 0: 0
        mode, 1: 0
        mode, 2: 0
        mode, 3: 11111111111111111111111110000000

        211
        11010011
        0 0 0 -45
        mode, 0: 0
        mode, 1: 0
        mode, 2: 0
        mode, 3: 11111111111111111111111111010011

        306
        100110010
        0 0 1 50
        mode, 0: 0
        mode, 1: 0
        mode, 2: 1
        mode, 3: 110010

    How is it suddenly possible for a byte to store 32 bits? How could I fix this?
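    A hedged observation, illustrated in Python rather than the poster's Java because Python can print the byte values without sign extension: the four bytes produced by writeInt (or the manual shifts) are almost certainly already correct, and what looks wrong is only the logging, since Java's byte type is signed and Integer.toBinaryString sign-extends a negative byte to 32 bits. The names below are purely illustrative.

        import struct

        # Pack the same sample values big-endian, the layout DataOutputStream.writeInt uses.
        for mode in (127, 128, 211, 306):
            packed = struct.pack(">i", mode)                              # 4 bytes, big-endian
            unsigned_view = list(packed)                                  # each byte as 0..255
            signed_view = [b - 256 if b >= 128 else b for b in packed]    # Java-style signed bytes
            print(mode, unsigned_view, signed_view)

        # 211 -> [0, 0, 0, 211] unsigned, [0, 0, 0, -45] signed: the wire bytes are fine,
        # only the printed representation differs between the two interpretations.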

    Read the article

  • Bricked - tty mode incorrect, terminal disabled, unity broken

    - by user989805
    After installing VirtualBox, Unity seems to have been uninstalled. The launcher and task bar are missing. I can't access a terminal with the normal Ctrl+Alt+T shortcut. I tried logging into a TTY but it says my login is incorrect; I am trying to log in with my normal Ubuntu username and password. I have searched for solutions, but since I don't have a terminal or TTY I can't bash out any commands, and I have tried safe boot but the root terminal always says it can't run whatever command I enter. I am really lost and I am trying to avoid a fresh install :( I greatly and genuinely appreciate any help! This is the computer I use for college!

    Read the article

  • Often "running in low-graphics mode” with NVidia GeForce GT 430

    - by user42228
    Often (every fifth time perhaps) when I turn on my system, I'm greeted with the "running in low-graphics mode" window. Sometimes all I have to do is reboot and the system will start correctly; sometimes there is a longer series of failed starts. If I go to the terminal and log in, I can run startx and my desktop will appear without problems. At the end of /var/log/Xorg.1.log I see this:

        [    77.459] (EE) Screen(s) found, but none have a usable configuration.
        [    77.459] Fatal server error:
        [    77.459] no screens found
        [    77.459] Please consult the The X.Org Foundation support at http://wiki.x.org

    Is the NVIDIA driver creating the appropriate X.Org configuration dynamically on every boot, or what else could explain that the configuration error above only occurs sometimes?

    Read the article

  • Atmospheric Scattering

    - by Lawrence Kok
    I'm trying to implement atmospheric scattering based on Sean O`Neil algorithm that was published in GPU Gems 2. But I have some trouble getting the shader to work. My latest attempts resulted in: http://img253.imageshack.us/g/scattering01.png/ I've downloaded sample code of O`Neil from: http://http.download.nvidia.com/developer/GPU_Gems_2/CD/Index.html. Made minor adjustments to the shader 'SkyFromAtmosphere' that would allow it to run in AMD RenderMonkey. In the images it is see-able a form of banding occurs, getting an blueish tone. However it is only applied to one half of the sphere, the other half is completely black. Also the banding appears to occur at Zenith instead of Horizon, and for a reason I managed to get pac-man shape. I would appreciate it if somebody could show me what I'm doing wrong. Vertex Shader: uniform mat4 matView; uniform vec4 view_position; uniform vec3 v3LightPos; const int nSamples = 3; const float fSamples = 3.0; const vec3 Wavelength = vec3(0.650,0.570,0.475); const vec3 v3InvWavelength = 1.0f / vec3( Wavelength.x * Wavelength.x * Wavelength.x * Wavelength.x, Wavelength.y * Wavelength.y * Wavelength.y * Wavelength.y, Wavelength.z * Wavelength.z * Wavelength.z * Wavelength.z); const float fInnerRadius = 10; const float fOuterRadius = fInnerRadius * 1.025; const float fInnerRadius2 = fInnerRadius * fInnerRadius; const float fOuterRadius2 = fOuterRadius * fOuterRadius; const float fScale = 1.0 / (fOuterRadius - fInnerRadius); const float fScaleDepth = 0.25; const float fScaleOverScaleDepth = fScale / fScaleDepth; const vec3 v3CameraPos = vec3(0.0, fInnerRadius * 1.015, 0.0); const float fCameraHeight = length(v3CameraPos); const float fCameraHeight2 = fCameraHeight * fCameraHeight; const float fm_ESun = 150.0; const float fm_Kr = 0.0025; const float fm_Km = 0.0010; const float fKrESun = fm_Kr * fm_ESun; const float fKmESun = fm_Km * fm_ESun; const float fKr4PI = fm_Kr * 4 * 3.141592653; const float fKm4PI = fm_Km * 4 * 3.141592653; varying vec3 v3Direction; varying vec4 c0, c1; float scale(float fCos) { float x = 1.0 - fCos; return fScaleDepth * exp(-0.00287 + x*(0.459 + x*(3.83 + x*(-6.80 + x*5.25)))); } void main( void ) { // Get the ray from the camera to the vertex, and its length (which is the far point of the ray passing through the atmosphere) vec3 v3FrontColor = vec3(0.0, 0.0, 0.0); vec3 v3Pos = normalize(gl_Vertex.xyz) * fOuterRadius; vec3 v3Ray = v3CameraPos - v3Pos; float fFar = length(v3Ray); v3Ray = normalize(v3Ray); // Calculate the ray's starting position, then calculate its scattering offset vec3 v3Start = v3CameraPos; float fHeight = length(v3Start); float fDepth = exp(fScaleOverScaleDepth * (fInnerRadius - fCameraHeight)); float fStartAngle = dot(v3Ray, v3Start) / fHeight; float fStartOffset = fDepth*scale(fStartAngle); // Initialize the scattering loop variables float fSampleLength = fFar / fSamples; float fScaledLength = fSampleLength * fScale; vec3 v3SampleRay = v3Ray * fSampleLength; vec3 v3SamplePoint = v3Start + v3SampleRay * 0.5; // Now loop through the sample rays for(int i=0; i<nSamples; i++) { float fHeight = length(v3SamplePoint); float fDepth = exp(fScaleOverScaleDepth * (fInnerRadius - fHeight)); float fLightAngle = dot(normalize(v3LightPos), v3SamplePoint) / fHeight; float fCameraAngle = dot(normalize(v3Ray), v3SamplePoint) / fHeight; float fScatter = (-fStartOffset + fDepth*( scale(fLightAngle) - scale(fCameraAngle)))/* 0.25f*/; vec3 v3Attenuate = exp(-fScatter * (v3InvWavelength * fKr4PI + fKm4PI)); v3FrontColor += 
v3Attenuate * (fDepth * fScaledLength); v3SamplePoint += v3SampleRay; } // Finally, scale the Mie and Rayleigh colors and set up the varying variables for the pixel shader vec4 newPos = vec4( (gl_Vertex.xyz + view_position.xyz), 1.0); gl_Position = gl_ModelViewProjectionMatrix * vec4(newPos.xyz, 1.0); gl_Position.z = gl_Position.w * 0.99999; c1 = vec4(v3FrontColor * fKmESun, 1.0); c0 = vec4(v3FrontColor * (v3InvWavelength * fKrESun), 1.0); v3Direction = v3CameraPos - v3Pos; } Fragment Shader: uniform vec3 v3LightPos; varying vec3 v3Direction; varying vec4 c0; varying vec4 c1; const float g =-0.90f; const float g2 = g * g; const float Exposure =2; void main(void){ float fCos = dot(normalize(v3LightPos), v3Direction) / length(v3Direction); float fMiePhase = 1.5 * ((1.0 - g2) / (2.0 + g2)) * (1.0 + fCos*fCos) / pow(1.0 + g2 - 2.0*g*fCos, 1.5); gl_FragColor = c0 + fMiePhase * c1; gl_FragColor.a = 1.0; }
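    When a shader like this misbehaves, one low-tech check is to port a small helper to the CPU and print numbers. Below is a rough Python port of only the shader's scale() function (the polynomial fit used for the optical-depth lookup), with arbitrary sample cosines; it is just a way to inspect intermediate values, not a fix, and whether it explains the half-black hemisphere here is an open question.

        import math

        def scale(f_cos, f_scale_depth=0.25):
            # Direct port of the GLSL scale() helper above, for CPU-side inspection.
            x = 1.0 - f_cos
            return f_scale_depth * math.exp(-0.00287 + x * (0.459 + x * (3.83 + x * (-6.80 + x * 5.25))))

        for c in (1.0, 0.5, 0.0, -0.25, -0.5):
            print(c, scale(c))  # the value grows extremely fast once the cosine goes negative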

    Read the article

  • Beat detection, weird detection

    - by Quincy
    I made this SoundAnalyzer class to detect beats in songs: // put it on pastebin because of its size, will put it here if people would rather have that: pastebin.com/8PdgZPP3. But for some reason it's only detecting beats from around 637 sec to around 641 sec and I have no idea why. I know the beats are being inserted from multiple bands, since I am finding duplicates, and it seems as if it's assigning a beat to each instant energy value in between those values. It's modeled after this: http://www.flipcode.com/misc/BeatDetectionAlgorithms.pdf So why won't the beats register properly?
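    For reference, a minimal Python sketch of the comparison that the flipcode paper describes (constants, names, and the block size are illustrative, not taken from the pastebin class): each block's instant energy is compared against the average energy of roughly the previous second, scaled by a sensitivity constant, and a beat is only recorded when the instant energy clearly exceeds that local average.

        from collections import deque

        def detect_beats(samples, rate=44100, block=1024, sensitivity=1.3):
            """Energy-comparison beat detection in the spirit of the flipcode article."""
            history = deque(maxlen=rate // block)   # ~1 second of past block energies
            beats = []
            for start in range(0, len(samples) - block, block):
                chunk = samples[start:start + block]
                instant = sum(s * s for s in chunk)          # instant energy of this block
                if len(history) == history.maxlen:
                    average = sum(history) / len(history)    # local average energy
                    if instant > sensitivity * average:      # spike above the average -> beat
                        beats.append(start / rate)           # timestamp in seconds
                history.append(instant)
            return beats

    One thing such a sketch makes visible: if every block inside a loud passage exceeds the threshold, each of them registers as a "beat", which is consistent with the duplicated, clustered detections described above.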

    Read the article

  • How to find which w3wp.exe to attach when debugging your SharePoint 2010 project

    - by ybbest
    When debugging a SharePoint 2010 project you need to attach to a w3wp.exe process; however, there are often quite a few of them and it is very hard to figure out which one to attach to. Today I will show you how to find the right process using a tool called Process Explorer.

    1. Download Process Explorer and run it.
    2. Find the w3wp.exe processes under wininit.exe, right-click the column header and click Select Columns.
    3. Include Command Line under Process Image.
    4. Now you can see your IIS site name next to w3wp.exe; in my case I'd like to attach to "SharePoint – BenDev80". You can see the PID of the process is 2920.
    5. From the above you know the process ID you'd like to attach to is 2920; you can then go ahead and attach to that process from Visual Studio.
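    As an aside, the same "which w3wp.exe is which" check can be scripted; the sketch below is a Python alternative to the Process Explorer steps above and assumes the third-party psutil package is installed. It prints each IIS worker process with its command line, where the application pool name typically appears after the -ap switch.

        import psutil  # assumption: pip-installed; not part of the article's Process Explorer workflow

        # Print every w3wp.exe PID together with its command line so the right
        # application pool (e.g. "SharePoint - BenDev80") can be matched to a PID.
        for proc in psutil.process_iter(["pid", "name", "cmdline"]):
            if (proc.info["name"] or "").lower() == "w3wp.exe":
                print(proc.info["pid"], " ".join(proc.info["cmdline"] or []))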

    Read the article

  • Standby Problem, PC can not enter Standby mode

    - by Leon95
    I have a problem with the standby function: my computer does not enter standby mode with Ubuntu 14.04 LTS. If I remember right it worked with Ubuntu 13.10, but that version was not installed on this PC for long. Now when I press Standby in the menu or on my keyboard, the display turns black for a few seconds, then some messages appear on the screen for a very short moment. After that, the log-in screen appears. Two times I was able to enter standby, but the other times it fails. Technical data about my PC: Ubuntu 14.04 with all updates; main memory: 3.8 GiB; processor: Intel® Core™ i3-2330M CPU @ 2.20GHz × 4; graphics: Intel® Sandybridge Mobile; graphics board: NVIDIA GeForce GT 555M CUDA 1GB; dual-boot system with Win7 x64; Medion P6812 laptop. Here is the message output: usually I got only half of the screen filled with messages like that; when I filmed it there was much more. I don't know, maybe this is a bug.

    Read the article

  • Stuck while Booting Text Mode in Ubuntu 12.04?

    - by sameetandpotatoes
    I edited the /etc/default/grub file and changed GRUB_CMDLINE_LINUX_DEFAULT="quiet splash" to GRUB_CMDLINE_LINUX_DEFAULT="text". This does make Ubuntu boot up into text mode; however, it gets stuck while booting up and does not show me the login prompt. Instead it says something like this:

        Begin running /scripts/init-bottom
        USB hub found
        ...more irrelevant things...
        * Stopping LightDM Display Manager

    I can press Ctrl + Alt + F2, see the login prompt and boot up like normal, but is there any reason for this? How can I change it so it does not get stuck? Edit: After 5 minutes, a new line came up: 557.1206341 ieee80211 phy0: channel change: 5540 -> 5560 failed (3).

    Read the article

  • Does MyEclipse have an implicit breakpoint in debug mode in the URLClassPath class? [migrated]

    - by MJM
    I am a beginner with the MyEclipse IDE; I am using version 8.6.1. My issue is: when I execute my program in debug mode, MyEclipse stops in the sun.misc.URLClassPath class and I must resume from the breakpoint (by pressing F8) to continue executing my program. MyEclipse stops in the URLClassPath class with the following thread stack:

        1. URLClassPath$JarLoader.<init>(URL, URLStreamHandler, HashMap) line: 581
        2. URLClassPath$JarLoader.ensureOpen() line: 631
        3. URLClassPath$JarLoader.getJarFile(URL) line: 641
        4. URLClassPath$JarLoader.ensureOpen() line: 631

    Note: this happens when some jar files exist in my project build path; when my application is simple, this problem does not occur and the first breakpoint hit is my own first breakpoint. Why does this happen?

    Read the article

  • How does btrfs RAID work in degraded mode?

    - by turbo
    My idea was that (using loopback devices) it works like this:

    1. Create the raid array: sudo mkfs.btrfs -m raid1 -d raid1 /dev/loop1 /dev/loop2
    2. You mount them (sudo mount /dev/loop1 /mnt) and mark them: touch goodcondition
    3. You unmount and simulate disk failure (remove the disk, or delete loopback device loop2 in my case)
    4. You mount degraded (-o degraded) and mark again: touch degraded
    5. You add the bad disk again: sudo btrfs dev add /dev/loop2
    6. You rebalance: sudo btrfs fi ba /mnt

    And RAID 1 should work again. But that's not the case. sudo btrfs fi show:

        Total devices 3 FS bytes used 28.00KB
        devid 3 size 4.00GB used 264.00MB path /dev/loop1
        devid 2 size 4.00GB used 272.00MB path /dev/loop2
        *** Some devices missing

    The file degraded lives on loop1 but not on loop2 when loop2 is mounted in degraded mode. Why is that?

    Read the article

  • Particle and Physics problem.

    - by Quincy
    This was originally a forum post so I hope you guys don't mind it being 2 questions in one. I am making a game and I got some basic physics implemented. I have 2 problems, 1 with particles being drawn in the wrong place and one with going through walls while jumping in corners. Skip over to about 15 sec video showing the 2 problems : http://youtube.com/watch?v=Tm9nfWsWfiM So the problem with the particles seems to be coming from the removal, as soon as I remove that piece of code it instantly works, but there shouldn't be a problem since they shouldn't even draw when their energy gets to 0 (and then they get removed) So my first question is, how are these particles getting warped all over the screen ? Relevant code : Particle class : class Particle { //Physics public Vector2 position = new Vector2(0,0); public float direction = 180; public float speed = 100; public float energy = 1; protected float startEnergy = 1; //Visual public Sprite sprite; public float rotation = 0; public float scale = 1; public byte alpha = 255; public BlendMode blendMode { get { return sprite.BlendMode; } set { sprite.BlendMode = value; } } public Particle() { } public virtual void Think(float frameTime) { if (energy - frameTime < 0) energy = 0; else energy -= frameTime; position += new Vector2((float)Math.Cos(MathHelper.DegToRad(direction)), (float)Math.Sin(MathHelper.DegToRad(direction))) * speed * frameTime; alpha = (byte)(255 * energy / startEnergy); sprite.Rotation = rotation; sprite.Position = position; sprite.Color = new Color(sprite.Color.R, sprite.Color.G, sprite.Color.B, alpha); } public virtual void Draw(float frameTime) { if (energy > 0) { World.camera.DrawSprite(sprite); } } // Basic particle implementation class BasicSprite : Particle { public BasicSprite(Sprite _sprite) { sprite = _sprite; } } Emitter : class Emitter { protected static Random rand = new Random(); protected List<Particle> particles = new List<Particle>(); public BaseEntity target = null; public Vector2 position = new Vector2(0, 0); public bool Active = true; public float timeAlive = 0; public int particleCount = 0; public int ParticlesPerSeccond { get { return (int)(1 / particleSpawnTime); } set { particleSpawnTime = 1 / (float)value; } } public float dieTime = float.MaxValue; float particleSpawnTime = 0.05f; float spawnTime = 0; public Emitter() { } public virtual void Think(float frametime) { spawnTime += frametime; if (dieTime != float.MaxValue) { timeAlive += frametime; if (timeAlive >= dieTime) Active = false; } if (Active) { if (target != null) position = target.Position; while (spawnTime > particleSpawnTime) { spawnTime -= particleSpawnTime; AddParticle(); particleCount++; } } for (int i = 0; i < particles.Count; i++) { particles[i].Think(frametime); if (particles[i].energy <= 0) { particles.Remove(particles[i]); // As soon as this is removed, it works particleCount--; } } } public virtual void AddParticle() { } public virtual void Draw(float frametime) { foreach (Particle particle in particles) { particle.Draw(frametime); } } } class BloodEmitter : Emitter { Image image; public BloodEmitter() { image = new Image(@"Content/Particles/TinyCircle.png"); image.CreateMaskFromColor(new Color(255, 0, 255, 255)); this.dieTime = 0.5f; this.ParticlesPerSeccond = 100; } public override void AddParticle() { Sprite sprite = new Sprite(image); sprite.Color = new Color((byte)(rand.NextDouble() * 255), (byte)(rand.NextDouble() * 255), (byte)(rand.NextDouble() * 255)); BasicSprite particle = new BasicSprite(sprite); particle.direction = 
(float)rand.NextDouble() * 360; particle.position = position; particle.blendMode = BlendMode.Alpha; particles.Add(particle); } } The seccond problem is the physics problem, for some reason I can get through the right bottom corner while jumping. I think this is coming from me switching animations but I thought I made it compensate for that. Relevant code : PhysicsEntity : class PhysicsEntity : BaseEntity { // Horizontal movement constants protected const float maxHorizontalSpeed = 1000; protected const float horizontalAcceleration = 15; protected const float horizontalDragAir = 0.95f; protected const float horizontalDragGround = 0.95f; // Vertical movement constants protected const float maxVerticalSpeed = 1000; protected const float verticalAcceleration = 20; // Everything needed for movement and correct animations protected float movement = 0; protected bool onGround = false; protected Vector2 Velocity = new Vector2(0, 0); protected float maxSpeed = 0; float lastThink = 0; float thinkTime = 1f/60f; public PhysicsEntity(Vector2 position, Sprite sprite) : base(position, sprite) { } public override void Draw(float frameTime) { base.Draw(frameTime); } public override void Think(float frameTime) { CalculateMovement(frameTime); base.Think(frameTime); } protected void CalculateMovement(float frameTime) { lastThink += frameTime; while (lastThink > thinkTime) { onGround = false; Velocity.X = MathHelper.Clamp(Velocity.X + horizontalAcceleration * movement, -maxHorizontalSpeed, maxHorizontalSpeed); if (onGround) Velocity.X *= horizontalDragGround; else Velocity.X *= horizontalDragAir; if (maxSpeed < Velocity.X) maxSpeed = Velocity.X; Velocity.Y = MathHelper.Clamp(Velocity.Y + verticalAcceleration, -maxVerticalSpeed, maxVerticalSpeed); lastThink -= thinkTime; DoCollisions(thinkTime); DoAnimations(thinkTime); } } public virtual void DoAnimations(float frameTime) { } public void DoCollisions(float frameTime) { Position.Y += Velocity.Y * frameTime; Vector2 tileCollision = GetTileCollision(); if (tileCollision.X != -1 || tileCollision.Y != -1) { Vector2 collisionDepth = CollisionRectangle.DepthIntersection( new Rectangle( tileCollision.X * World.tileEngine.TileWidth, tileCollision.Y * World.tileEngine.TileHeight, World.tileEngine.TileWidth, World.tileEngine.TileHeight ) ); Position.Y += collisionDepth.Y; if (collisionDepth.Y < 0) onGround = true; Velocity.Y = 0; } Position.X += Velocity.X * frameTime; tileCollision = GetTileCollision(); if (tileCollision.X != -1 || tileCollision.Y != -1) { Vector2 collisionDepth = CollisionRectangle.DepthIntersection( new Rectangle( tileCollision.X * World.tileEngine.TileWidth, tileCollision.Y * World.tileEngine.TileHeight, World.tileEngine.TileWidth, World.tileEngine.TileHeight ) ); Position.X += collisionDepth.X; Velocity.X = 0; } } public void DoCollisions(Vector2 difference) { CollisionRectangle.Y = Position.Y - difference.Y; CollisionRectangle.Height += difference.Y; Vector2 tileCollision = GetTileCollision(); if (tileCollision.X != -1 || tileCollision.Y != -1) { Vector2 collisionDepth = CollisionRectangle.DepthIntersection( new Rectangle( tileCollision.X * World.tileEngine.TileWidth, tileCollision.Y * World.tileEngine.TileHeight, World.tileEngine.TileWidth, World.tileEngine.TileHeight ) ); Position.Y += collisionDepth.Y; if (collisionDepth.Y < 0) onGround = true; Velocity.Y = 0; } CollisionRectangle.X = Position.X - difference.X; CollisionRectangle.Width += difference.X; tileCollision = GetTileCollision(); if (tileCollision.X != -1 || tileCollision.Y != -1) { 
Vector2 collisionDepth = CollisionRectangle.DepthIntersection( new Rectangle( tileCollision.X * World.tileEngine.TileWidth, tileCollision.Y * World.tileEngine.TileHeight, World.tileEngine.TileWidth, World.tileEngine.TileHeight ) ); Position.X += collisionDepth.X; Velocity.X = 0; } } Vector2 GetTileCollision() { int topLeftTileX = (int)(CollisionRectangle.TopLeft.X / World.tileEngine.TileWidth); int topLeftTileY = (int)(CollisionRectangle.TopLeft.Y / World.tileEngine.TileHeight); int BottomRightTileX = (int)(CollisionRectangle.DownRight.X / World.tileEngine.TileWidth); int BottomRightTileY = (int)(CollisionRectangle.DownRight.Y / World.tileEngine.TileHeight); if (CollisionRectangle.DownRight.Y % World.tileEngine.TileHeight == 0) // If your exactly against the tile don't count that as being inside the tile BottomRightTileY -= 1; if (CollisionRectangle.DownRight.X % World.tileEngine.TileWidth == 0) // If your exactly against the tile don't count that as being inside the tile BottomRightTileX -= 1; for (int i = topLeftTileX; i <= BottomRightTileX; i++) { for (int j = topLeftTileY; j <= BottomRightTileY; j++) { if (World.tileEngine.TileIsSolid(i, j)) { return new Vector2(i, j); } } } return new Vector2(-1, -1); } } Player : enum State { Standing, Running, Jumping, Falling, Sliding, WallSlide } class Player : PhysicsEntity { private State state { get { return currentState; } set { if (currentState != value) { currentState = value; animationChanged = true; } } } private State currentState = State.Standing; private BasicEmitter basicEmitter = new BasicEmitter(); public bool flipped; public bool animationChanged = false; protected const float jumpPower = 600; AnimationManager animationManager; Rectangle DrawRectangle; public override Rectangle CollisionRectangle { get { return new Rectangle( Position.X - DrawRectangle.Width / 2f, Position.Y - DrawRectangle.Height / 2f, DrawRectangle.Width, DrawRectangle.Height ); } } public Player(Vector2 position, Sprite sprite) : base(position, sprite) { // Only posted the relevant bit DrawRectangle = animationManager.currentAnimation.drawingRectangle; } public override void Draw(float frameTime) { World.camera.DrawSprite( Sprite, Position + new Vector2(DrawRectangle.X, DrawRectangle.Y), animationManager.currentAnimation.drawingRectangle ); } public override void Think(float frameTime) { //I only posted the relevant stuff if (animationChanged) { // if the animation has changed make sure we compensate for the change in with and height animationChanged = false; DoCollisions(animationManager.getSizeDifference()); } DoCustomMovement(); base.Think(frameTime); if (!onGround && Velocity.Y > 0) { state = State.Falling; } } void DoCustomMovement() { if (onGround) { if (World.renderWindow.Input.IsKeyDown(KeyCode.W)) { Velocity.Y = -jumpPower; state = State.Jumping; } } } public override void DoAnimations(float frameTime) { string stateName = Enum.GetName(typeof(State), state); if (!animationManager.currentAnimationIs(stateName)) { animationManager.PlayAnimation(stateName); } animationManager.Think(frameTime); DrawRectangle = animationManager.currentAnimation.drawingRectangle; Sprite.Center = new Vector2( DrawRectangle.X + DrawRectangle.Width / 2, DrawRectangle.Y + DrawRectangle.Height / 2 ); Sprite.FlipX(flipped); } So why am I warping through walls ? I have given this some thought but I just can't seem to find out why this is happening. Full source if needed : source : http://www.mediafire.com/?rc7ddo09gnr68zd (download link)

    Read the article

  • XNA frame rate spikes in full screen mode

    - by ProgrammerAtWork
    I'm loading a simple texture and rotating it in XNA, and this works. But when I run it in full screen 1920x1080 mode I see spikes while my texture is rotating. If I run it windowed with 1920x1080 resolution, I don't get the spikes. The size of the texture does not seem to matter, I tried 512 texture size and 2048 texture size, same thing happens. Spikes in full screen, no spikes in windowed, resolution does not seem to matter, Debug or Release does not seem to do anything either. Anyone got ideas of what could be the problem? Edit: I think this problem has something to do with the vertical retrace. Set this property: _graphicsDeviceManager.SynchronizeWithVerticalRetrace = false; you'll lose vsync but it will not stutter.

    Read the article

  • Can't disable giant cursors (from accessibility mode)

    - by jackweirdy
    I've just installed ubuntu 12.04 from a livecd. Out of curiosity, I enabled the accessibility options for people who are hard of sight. As you can guess this does the usual stuff of inverting colours, increasing text size and making the cursor larger. Having finished the installation I booted into the new system to find accessibility mode was still installed. From the lightdm login screen I disabled this which switched colours and text size back to default, however it's only the pointer cursor that has gone back to default. To put it another way, the "hand" icon that you get when hovering over a link, the cursor which appears when typing and pretty much every other cursor on the system are still large. I've looked on the Universal Access menu, but there's no option to disable large cursors. I've tried toggling accessibility on and off but to no avail.

    Read the article

  • Does Ubuntu support SATA drives in AHCI mode?

    - by timelessbeing
    I have a current installation of Ubuntu 13. Will it boot if I switch my SATA controller to AHCI in the BIOS? (I installed Ubuntu in IDE mode.) I have to wait until I fix my GRUB (Windows ate it), so I thought I'd take a poll here first in case there are any precautions. I ask because it was a royal PITA to do it in Windows. Will I need to reinstall Ubuntu to enable this? I don't mind doing that since it was just installed and I have nothing on it yet, and I kinda botched the install anyway.

    Read the article

  • Why is Claws Mail starting in offline mode?

    - by Thanks
    I am sorry if this is not the best place to ask this, but hopefully someone might be able to help. After I upgraded to Ubuntu 11.10 from 11.04, Claws Mail 3.7.10 began starting in offline mode. It continues to do so upon every startup. I have searched and searched but cannot find any option that might affect this. Am I overlooking the option? Is this a bug with the current versions of Claws Mail and Ubuntu? Or is there some other way to fix this? Thank you.

    Read the article

  • glx not working, optirun works, gnome in fallback mode

    - by user26766
    It all started when I installed NVIDIA's own driver. Uninstalling it and reverting back to nvidia-current didn't solve the problem, so I have been playing with this for a while. Now nvidia-current seems to be functional. It seems GLX support is missing, and my Intel graphics is not responding. GNOME loads only in fallback mode. Here are some outputs:

        glxinfo
        name of display: :0.0
        Error: couldn't find RGB GLX visual or fbconfig

        glxgears
        Error: couldn't get an RGB, Double-buffered visual

        optirun glxgears works fine

        lspci | grep VGA
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GF108 [GeForce GT 540M] (rev ff)

    How can I fix this? Thanks.

    Read the article
