Search Results

Search found 2134 results on 86 pages for 'ansi colors'.

Page 67/86 | < Previous Page | 63 64 65 66 67 68 69 70 71 72 73 74  | Next Page >

  • Numbering grouped data in Excel

    - by Jeff
    I have an Excel spreadsheet (2010) with data similar to this:
        Dogs Brown Nice
        Dogs White Nice
        Dogs White Moody
        Cats Black Nice
        Cats Black Mean
        Cats White Nice
        Cats White Mean
    I want to group these animals but I only care about species and color. I don't care about disposition. I want to assign group numbers to the set as shown here:
        1 Dogs Brown Nice
        2 Dogs White Nice
        2 Dogs White Moody
        3 Cats Black Nice
        3 Cats Black Mean
        4 Cats White Nice
        4 Cats White Mean
    I was able to select all the species and colors, then from the data tab select 'advanced', then 'unique records only'. This collapsed the data so that I could number the visible rows. Then when I 'cleared' the filter I could easily just fill the blank areas under the numbers with the number above. The problem is that my real data has far too many rows for this to be practical. Also, the trick about entering 1 in the first cell, 2 in the cell below, selecting both then dragging the corner down to 'auto-number' doesn't seem to work when you're viewing filtered rows. Any way to do this?

    Read the article

  • Keeping Xv Overlay configuration throughout an X session.

    - by kriss
    After upgrading my Linux system from Ubuntu 9.04 to Ubuntu 10.10, I succeeded in correcting most problems (all related to Intel 82865G Integrated Graphics Adapter support; compiz is still not working, but that's another matter), but for one of them I only have a partial solution. Whenever I play a video the colors are much too saturated. This is really a problem for skin tones, which appear reddish (everyone seems to be coming back from a ski vacation with a deep sunburn). As this effect only occurs with videos, not with pictures, I finally figured out it was related to the Xv video overlay configuration, and I can correct it by typing: xvattr -a XV_SATURATION -v 120 This changes the default saturation value, which is 500 and much too high in my case; by eye, the correct value seems to be somewhere between 100 and 150. Now my problem is that I have to type the above command each time I play a video. If I type it before running the video it has no effect; if I close the video and open a new one, I have to type it again, and so on. I tried putting it in Xsession and (logically) it has no effect there either. What can I do to get the correct setting whenever I play a video, without typing the above command every time?

    Read the article

  • Is it possible to have a conditional formatting cell "visually cycle" through all the formats that evaluated true?

    - by Ben
    Like the title says, "In Excel, when a cell has multiple conditional formatting rules that evaluate true, is it possible to have the cell "visually cycle" through all the formats that evaluated true? If not, suggestions on what to do would be appreciated!" I'm creating an employee schedule for a business that has multiple job areas that need to have an employee assigned to cover. The schedule is currently set up with the date on the top row, employee list down the left column, and the employee's assigned "job area" cross-referencing with the date on the top row. Originally it was set up where if every required "job area" didn't have someone assigned to it, the date would (via conditional formatting) change to red. I've set it up now that if a condition isn't met, the date will change to the color of the "job area" that doesn't have an employee assigned to it. However, there are cases where multiple job areas don't have an employee assigned, but the date will only change color based on the first condition that isn't met. It'd be nice if there was some way for the date cell to cycle through the different colors that correspond to the job areas where no one is assigned. I have a hunch that's not possible though. If it is possible, I'd love to know how to do it. And if it isn't, if anyone has any suggestions on how I can modify the Excel sheet to make it easier to identify the job areas that don't have anyone assigned to them, I would appreciate it. FYI This schedule goes out months in advance.

    Read the article

  • Is having a [high-end] video card important on a server?

    - by Patrick
    My application is quite an interactive application, with lots of colors and drag-and-drop functionality but no fancy 3D effects, animations or video, so I only used plain GDI (no GDI+, no DirectX). In the past my applications ran on desktops or laptops, and I suggested that my customers invest in a decent video card, with:
        a minimum resolution of 1280x1024
        a minimum color depth of 24 bits
        X megabytes of memory on the video card
    Now my users are switching more and more to terminal servers, hence my questions: What is the importance of a video card on a terminal server? Is a video card needed on the terminal server at all? If it is, is the resolution of the remote desktop client limited to the resolutions supported by the video card on the server? Can the choice of video card in the server influence the performance of the applications running on the terminal server (but shown on a desktop PC)? If I start to make use of graphical libraries (like Qt) or things like DirectX, will this then have an influence on the choice of video card on the terminal server? Are calculations in that case 'offloaded' to the video card, even on the terminal server? Thanks.

    Read the article

  • Can I use @import to import Kod's default style sheet into my own?

    - by Thomas Upton
    I understand that Kod is being actively developed and is prone to drastic changes in any area. I would like to modify some small things (like font face and size or certain colors) while still being able to benefit from any changes or updates to the default Kod stylesheet. I thought that I would be able to @import the default stylesheet into my own to achieve this. This is what ~/.kod/custom.css would look like, @import url("file:///Applications/Kod.app/Contents/Resources/style/default.css"); /* Change the default font face and color. */ body { font-family: Menlo, monospace; color: #efefef; } This stylesheet was set with the following defaults command, per the comments at the top of Kod's default CSS file: defaults write se.hunch.kod style/url ~/.kod/custom.css Unfortunately, this didn't work. When I first tried to reload the style, Kod crashed. It opened fine again, but the @import statement wasn't working, and Kod crashed every time I saved the custom.css file. Am I doing something wrong? Did I write my @import statement wrong? Is that not how @import is supposed to work? Did I miss some sort of documentation or Kod Google Groups post that mentions that Kod explicitly disallows this?

    Read the article

  • Lookup Multiple Results for Multiple Criteria

    - by Matt
    I've got a list of parent SKUs for items I need to create in my inventory system. This list has been finely pared down to the 165 products we would like to carry. However, each one of these 165 SKUs has between 2 and 8 child SKUs of different colors, sizes, etc. Those are stored on a different worksheet, mixed in with around 2,500 items. Those are the SKUs I need to input into my inventory system. Here is what it looks like. Sheet 1 is just SKUs:
        A
        1
        2
        3
        4
    Sheet 2 is comprised of all the child SKUs, with parent SKUs in column B. Not all parents have the same number of children:
        A       B
        1BLKM   1
        1BLKL   1
        1BLUM   1
        2BLKM   2
        2BLKL   2
        2BLUM   2
        2ORAM   2
        3BLKM   3
        3BLUM   3
    I want to look up all of the child SKUs for the parent SKU list that has been fine-tuned. The parent SKU is included as a column on the child SKU worksheet. I need to look up all matches of the parent SKU, then continue to move down the parent SKU list until all matches for all 165 parent items have been found. It seems like every function I try can't take an array as input. Is there a way to do this with LOOKUP or some combination of INDEX, MATCH, ROW, etc.? Any way at all to do it without VBA? Or maybe even a VBA solution with code that I can understand, as someone who hasn't used VBA before.

    Read the article

  • Acer recovery disks not bootable?

    - by user13743
    We got a new Acer laptop with Vista installed at work. As it was getting ready to go out in the field, we wanted to do a burn-in test on it. We made the recovery DVDs before we ran the test. Part of the burn-in was bonnie++, which does a destructive read/write test of the hard drive. The machine passed with flying colors, but when we tried to boot from the recovery DVD to begin re-installing the system, the machine eventually fell back to a PXE boot. After doing some googling, it appears these 'recovery' disks expect a certain recovery partition to exist on the hard drive, are in fact not bootable at all, and are useless in the absence of the recovery partition. Is this the case, and is this "The Way Things Are" with all PC manufacturers and Windows Vista+ nowadays? How do I get my hands on actual bootable DVDs? I've emailed Acer support. I see an option on their site to purchase recovery disks, but I suspect these are the same non-bootable disks that I burned on the new system. Will Acer provide actual boot disks?

    Read the article

  • Photoshop CS4. How do I make sure my color stays the same in my different .psd files? (could be RGB

    - by Kris
    I asked two Photoshop experts I know, but they haven't got a clue, because all my settings are exactly the same in both files (except possibly the RGB type!! I'm not sure; please read on). I use RGB color, 72 DPI, 8 bits/channel. No adjustments (filters, like greyscale, etc.) are selected/used. The layers are both normal, and opacity and fill are 100% (yes, in both files). I took two screenshots, and you can see the difference: http://www.flickr.com/photos/30465871@N05/4623864297/ http://www.flickr.com/photos/30465871@N05/4624469754/ Both colors are ff795d, but that doesn't matter; any color I use gives me the same problem: they look different in the two files. Now, I know the CMYK settings (see screenshots) are different, but when the settings are the same the color changes. Why is this happening and how do I solve this problem? My guess is I'm working with a different type of RGB. It's sRGB IEC61966-2.1 in the file I created (file info raw data), but I can't find that in the file that started from a screenshot. If that's the problem, how do I change/convert the RGB profile once the file is already open? Thank you.

    Read the article

  • How to deal with Unicode strings in C/C++ in a cross-platform friendly way?

    - by Sorin Sbarnea
    On platforms other than Windows you can easily use char * strings and treat them as UTF-8. The problem is that on Windows you are required to accept and send messages using wchar_t * strings (the W functions). If you use the ANSI functions (A) you will not support Unicode. So if you want to write a truly portable application you need to compile it as Unicode on Windows. Now, in order to keep the code clean, I would like to see what the recommended way of dealing with strings is, one that minimizes ugliness in the code. Types of strings you may need: std::string, std::wstring, std::tstring, char *, wchar_t *, TCHAR *, CString (the ATL one). Issues you may encounter:
        cout/cerr/cin and their Unicode variants wcout, wcerr, wcin
        all the renamed wide-string functions and their TCHAR macros - like strcmp, wcscmp and _tcscmp
        constant strings inside code: with TCHAR you will have to fill your code with _T() macros
    What approach do you see as being best? (examples are welcome) Personally I would go for a std::tstring approach, but I would like to see how you would do the conversions where they are necessary.
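
    As a small illustration of the TCHAR/_T() route mentioned in the list above (a sketch only; the non-Windows fallback macros are hypothetical stand-ins, not from any standard header), the same source can be built as ANSI or Unicode on Windows and as plain char elsewhere:

        #include <stdio.h>
        #include <string.h>

        #ifdef _WIN32
        #include <tchar.h>              /* supplies TCHAR, _T() and _tcscmp      */
        #else
        typedef char TCHAR;             /* hypothetical fallback for non-Windows */
        #define _T(x) x
        #define _tcscmp strcmp
        #endif

        int main(void)
        {
            const TCHAR *name = _T("portable literal");  /* same source, ANSI or Unicode build */
            if (_tcscmp(name, _T("portable literal")) == 0)
                printf("the literal compares equal in this build\n");
            return 0;
        }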

    Read the article

  • Failure remediation strategy for File I/O

    - by Brett
    I'm doing buffered IO into a file, both read and write. I'm using fopen(), fseeko(), standard ANSI C file I/O functions. In all cases, I'm writing to a standard local file on a disk. How often do these file I/O operations fail, and what should the strategy be for failures? I'm not exactly looking for stats, but I'm looking for a general purpose statement on how far I should go to handle error conditions. For instance, I think everyone recognizes that malloc() could and probably will fail someday on some user's machine and the developer should check for a NULL being returned, but there is no great remediation strategy since it probably means the system is out of memory. At least, this seems to be the approach taken with malloc() on desktop systems, embedded systems are different. Likewise, is it worth reattempting a file I/O operation, or should I just consider a failure to be basically unrecoverable, etc. I would appreciate some code samples demonstrating proper usage, or a library guide reference that indicates how this is to be handled. Any other data is, of course, welcome.
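
    Since the question asks for sample code, here is a minimal sketch of the level of checking being discussed (an illustration only - the save_block helper and its arguments are made up; assumptions: a local disk file, plain ANSI C plus errno). Every call that can fail is tested, errno is reported, and the operation is abandoned rather than retried:

        #include <stdio.h>
        #include <string.h>
        #include <errno.h>

        /* Write a buffer to a file, checking every call that can fail. */
        int save_block(const char *path, const void *buf, size_t len)
        {
            FILE *f = fopen(path, "wb");
            if (f == NULL) {
                fprintf(stderr, "open %s: %s\n", path, strerror(errno));
                return -1;
            }
            if (fwrite(buf, 1, len, f) != len) {
                fprintf(stderr, "write %s: %s\n", path, strerror(errno));
                fclose(f);
                return -1;
            }
            if (fclose(f) != 0) {   /* buffered data is flushed here, so errors can surface late */
                fprintf(stderr, "close %s: %s\n", path, strerror(errno));
                return -1;
            }
            return 0;
        }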

    Read the article

  • Connecting a laptop to a TV via HDMI

    - by Madmartigan
    I just bought a new Dell XPS17 laptop (Win7) that only has HDMI output. My last two laptops had VGA, which I used to connect to my Sony Bravia 32" TV with no issues, but with HDMI it's been quite a headache. The display adapter drivers have been updated to the latest versions: Intel(R) HD Graphics Family and NVIDIA GeForce GT 550M. I went to a store and plugged in to 4 different TVs from different manufacturers. A sales rep and I spent about 30 minutes being baffled by the results (which are the same as with my current TV):
        - Extremely buggy behavior in the Nvidia and Windows display/resolution control panels
        - Cannot extend or duplicate displays, can only select one
        - Third and fourth output devices "randomly" detected by the Windows control panel
        - Could not get the screen to fit the output (edges cut off on all sides by about half an inch)
        - Resolution and colors less than perfect, artifacts around text
        - Display "randomly" cuts out
        - Defaults to TV output only when plugged in
        - Cannot change resolution on either device when connected
        - No audio from the TV
    Plugged in to 3 monitors from different manufacturers:
        - Defaults to duplicated displays when plugged in
        - Everything works perfectly
    So far, four people have gone through all the settings on the laptop with no luck. I had similar, but not exactly matching, results with a different laptop. I'm using the Sony Bravia at home for now, but in order to get it to work I have to turn on the laptop, wait until the display shows up on it, close the lid, then cycle through each output channel on the TV until I come back around to the HDMI port again, and I still have the symptoms described above. However: once in a while, it just works. Sometimes, seemingly at random, the output fits the screen perfectly. Sometimes the audio comes through the speakers too, but not always. Usually my screen saver "Mystify" comes up with a message that it cannot be displayed due to a limitation of the video card, but sometimes it works fine. These three things seem to be independent of each other and don't always happen together. So, is there any way to get the laptop to output correctly to a TV, or is it just not meant to be?

    Read the article

  • program won't find math.h anymore

    - by 130490868091234
    After a long time, I downloaded a program I co-developed and tried to recompile it on my Ubuntu Linux 12.04, but it seems it does not find math.h anymore. This may be because something has changed recently in gcc, but I can't figure out if it's something wrong in src/Makefile.am or a missing dependency: Download from http://www.ub.edu/softevol/variscan/: tar xzf variscan-2.0.2.tar.gz cd variscan-2.0.2/ make distclean sh ./autogen.sh make I get: [...] gcc -DNDEBUG -O3 -W -Wall -ansi -pedantic -lm -o variscan variscan.o statistics.o common.o linefile.o memalloc.o dlist.o errabort.o dystring.o intExp.o kxTok.o pop.o window.o free.o output.o readphylip.o readaxt.o readmga.o readmaf.o readhapmap.o readxmfa.o readmav.o ran1.o swcolumn.o swnet.o swpoly.o swref.o statistics.o: In function `calculate_Fu_and_Li_D': statistics.c:(.text+0x497): undefined reference to `sqrt' statistics.o: In function `calculate_Fu_and_Li_F': statistics.c:(.text+0x569): undefined reference to `sqrt' statistics.o: In function `calculate_Fu_and_Li_D_star': statistics.c:(.text+0x63b): undefined reference to `sqrt' statistics.o: In function `calculate_Fu_and_Li_F_star': statistics.c:(.text+0x75c): undefined reference to `sqrt' statistics.o: In function `calculate_Tajima_D': statistics.c:(.text+0x85d): undefined reference to `sqrt' statistics.o:statistics.c:(.text+0xcb1): more undefined references to `sqrt' follow statistics.o: In function `calcRunMode21Stats': statistics.c:(.text+0xe02): undefined reference to `log' statistics.o: In function `correctedDivergence': statistics.c:(.text+0xe5a): undefined reference to `log' statistics.o: In function `calcRunMode22Stats': statistics.c:(.text+0x104a): undefined reference to `sqrt' statistics.o: In function `calculate_Fu_fs': statistics.c:(.text+0x11a8): undefined reference to `fabsl' statistics.c:(.text+0x11ca): undefined reference to `powl' statistics.c:(.text+0x11f2): undefined reference to `logl' statistics.o: In function `calculateStatistics': statistics.c:(.text+0x13f2): undefined reference to `log' collect2: ld returned 1 exit status make[1]: *** [variscan] Error 1 make[1]: Leaving directory `/home/avilella/variscan/latest/variscan-2.0.2/src' make: *** [all-recursive] Error 1 The libraries are there because this simple example works perfectly well: $ gcc test.c -o test -lm $ cat test.c #include <stdio.h> #include <math.h> int main(void) { double x = 0.5; double result = sqrt(x); printf("The hyperbolic cosine of %lf is %lf\n", x, result); return 0; } Any ideas?
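
    One observation about the log above: "undefined reference to `sqrt'" is a linker error, which usually points at the math library not being linked rather than math.h not being found. On newer Ubuntu toolchains the linker can drop a -lm that appears before the object files that need it, and in the command shown -lm does come first. A quick manual check (assuming the other flags from the Makefile are kept as-is) would be to rerun the failing link with -lm moved to the end:

        gcc -DNDEBUG -O3 -W -Wall -ansi -pedantic -o variscan variscan.o statistics.o \
            common.o linefile.o memalloc.o dlist.o errabort.o dystring.o intExp.o kxTok.o \
            pop.o window.o free.o output.o readphylip.o readaxt.o readmga.o readmaf.o \
            readhapmap.o readxmfa.o readmav.o ran1.o swcolumn.o swnet.o swpoly.o swref.o -lm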

    Read the article

  • RDP Connection to Windows 7 stays really slow

    - by Pavlo
    I have an issue with connecting to Windows 7 via RDP. I can open an RDP session, but regardless of any settings, the response times are really long. This is particularly the case when opening a web page in a browser; I've tried IE, Firefox and Google Chrome. I also use an RDP connection to a Windows 2008 Server from the same client machine, and the speed is perfectly normal with all features turned on. We have Gigabit Ethernet here, so I don't think it can be the client's fault. As for the Windows 7 machine, I've tried turning all the graphical features off and setting the color depth to 256 colors. Result - the same. If I work locally on the machine I can't see any lag. What else have I tried: using the old RDP 5 client from Microsoft, and setting the network autotuninglevel as seen here. Do you have any ideas? Thanks in advance! Update: the problem seems to be with rendering window contents. All the window borders and panes are rendered pretty quickly, but the content shows up very slowly. Also, mouse movements are recognised by the Win 7 box only after a delay. Are there some hidden settings in RDP where one could turn some advanced features off or turn some caching on? I use bitmap caching, but this apparently doesn't help.

    Read the article

  • how to diagnose a hard system seizure? Dell+Ubuntu

    - by rob
    I've got Ubuntu 9.10 on a Dell Vostro 420 desktop, a little over a year old, which I use for plain vanilla work stuff (email, web, terminal, text editor). Every now and then, at totally random times, it completely freezes on me. Hard. Mouse and keyboard stop working, cursor stops blinking, clock stops moving. All I can do is hold down the power button on the front of the box to shut it off. Sometimes it happens after several months of continuous uptime; sometimes it happens a few minutes after a reboot, while all I've done is open a terminal to look at log files, or maybe firefox to do a google search. Each time, there is nothing at all in /var/log/messages at the time of the crash. This makes it seem like a hardware problem, and indeed a few months ago I opened the box and wiggled everything and the problem went away for a while. But now it's back. I went in and checked everything, took out each RAM card and reseated. No luck. I ran all the system diagnostics (the long version) and everything passed with flying colors. Something is messed up in this box, but without any useful logs or failed tests, how in the world am I going to find it? And of course, Dell's not gonna help me cause I went and replaced Windows with Ubuntu. What steps would you take next to track down this problem?

    Read the article

  • How can I fix the "Display Driver has stopped responding and has recovered" error?

    - by Vitor Rangel
    I'm using a GeForce GTX 580 with Windows 7 64-bit. The driver version of the GTX is 301.42. The problem happens after a few minutes, when I'm playing specific games. It doesn't happen in all games, and I have no idea why these particular games don't work. The games that don't work: Battlefield 3, Civilization V, Sniper Elite V2. The games that work: Mass Effect 3, Crysis 2, Team Fortress 2, Left 4 Dead 2, Skyrim, L.A. Noire. As you can see, it's not a case of "the more demanding games stop working". I've tried updating the graphics card driver and the motherboard BIOS, and even formatted my computer (it needed it anyway) and installed the latest version of every driver. This problem has happened ever since I bought my graphics card, 6 months ago. After 10 to 20 minutes, the pixels on the monitor become garbled, with random colors and artifacts, as if the screen were broken. Then everything goes black, and the message "Display driver has stopped responding and has recovered" appears. After that, I need to close the game and start it again. I am not overclocking, and my temperature never goes higher than 70 °C.

    Read the article

  • Why do moving lines become fuzzy on my monitor?

    - by CodeInChaos
    I recently got a new notebook. With moving images there are some graphical issues, and I'd like to know what causes them. None of my earlier monitors exhibited similar issues. Moving high-contrast lines become jagged, similar to interlaced video. When moving a horizontal line vertically the artifacts are colored; when moving a vertical line horizontally they aren't. The effect isn't observable in static images, and the faster the motion, the wider the zone in which it occurs. The effect is very visible if I move a window around: it shows up on the window borders and wherever high-contrast lines appear. But it appears when watching videos too. The vertical line in that image moves to the right, the horizontal line upwards. The effect is most likely related to the fact that each real pixel consists of different sub-pixels for the different color channels. But how would these cause the observed effect? Does each color channel change to its target brightness at a different rate? The optical impression is that every second pixel, in a chessboard-like arrangement, adapts more slowly than its neighbors. But that doesn't make much sense.

    Read the article

  • Multiple Settings for One Monitor

    - by Lynda
    I have my monitor set up the way I like it: it shows colors accurately and isn't overly bright (white doesn't blind me). However, I edit photos for the web, and while this is great for my screen, I have noticed that on other screens my photos will at times look washed out and don't present well. If I manually change my monitor settings to be brighter (plus a few other tweaks), I can replicate the look of other monitors closely enough to help prevent washed-out/blown-out photos. Here is the issue: I need to be able to change settings between the two fairly easily. Manually setting my monitor for one and then back to the other is time-consuming and a pain to get right every time. I would like to be able to preset certain settings (in software) that I can quickly switch between by pressing hotkeys or clicking a shortcut on my desktop. I have an AMD video card and have attempted to use the AMD VISION Engine Control Center to add presets and then tweak the settings for each. However, when I switch between the presets the settings stay the same. How do I build presets using the VISION Engine Control Center that actually work (maybe I misunderstand what it is saving?), or is there a way to do this in Windows 7, or free software that works decently for this purpose?

    Read the article

  • Can isdigit legitimately be locale dependent in C

    - by cdev
    In the section covering setlocale, the ANSI C standard states in a footnote that the only ctype.h functions whose behaviour is not affected by the current locale are isdigit and isxdigit. The Microsoft implementation of isdigit is locale dependent because, for example, in locales using code page 1250 isdigit only returns non-zero for characters in the range 0x30 ('0') - 0x39 ('9'), whereas in locales using code page 1252 isdigit also returns non-zero for the superscript digits 0xB2 ('²'), 0xB3 ('³') and 0xB9 ('¹'). Is Microsoft in violation of the C standard by making isdigit locale dependent? In this question I am primarily interested in C90, which Microsoft claims to conform to, rather than C99. Additional background: Microsoft's own documentation of setlocale incorrectly states that isdigit is unaffected by the LC_CTYPE part of the locale. The section of the C standard that covers the ctype.h functions contains some wording that I consider ambiguous: "The behavior of these functions is affected by the current locale. Those functions that have locale-specific aspects only when not in the "C" locale are noted below." I consider this ambiguous because it is unclear what it is trying to say about functions such as isdigit for which there are no notes about locale-specific aspects. It might be trying to say that such functions must be assumed to be locale dependent, in which case Microsoft's implementation of isdigit would be OK. (Except that the footnote I mentioned earlier seems to contradict this interpretation.)
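
    Whatever the correct reading of the standard, a common defensive pattern (a sketch only; the helper names below are illustrative) is to bypass ctype.h when just the ten ASCII digits are meant, since the C standard guarantees '0' through '9' are contiguous in every execution character set:

        #include <ctype.h>

        /* Locale-independent test for the ten ASCII digits only. */
        static int is_ascii_digit(int ch)
        {
            return ch >= '0' && ch <= '9';
        }

        /* For comparison, the library call: isdigit() must be given an
         * unsigned char value (or EOF), and on implementations that treat
         * it as locale-sensitive its result can vary with LC_CTYPE. */
        static int is_digit_via_ctype(char ch)
        {
            return isdigit((unsigned char)ch);
        }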

    Read the article

  • Calling from C# a C function which accepts a struct array allocated by the caller

    - by lifey
    I have the following C struct struct XYZ { void *a; char fn[MAX_FN]; unsigned long l; unsigned long o; }; And I want to call the following function from C#: extern "C" int func(int handle, int *numEntries, XYZ *xyzTbl); Where xyzTbl is an array of XYZ of size numEntires which is allocated by the caller I have defined the following C# struct: [System.Runtime.InteropServices.StructLayoutAttribute(System.Runtime.InteropServices.LayoutKind.Sequential, CharSet = System.Runtime.InteropServices.CharSet.Ansi)] public struct XYZ { public System.IntPtr rva; [System.Runtime.InteropServices.MarshalAsAttribute(System.Runtime.InteropServices.UnmanagedType.ByValTStr, SizeConst = 128)] public string fn; public uint l; public uint o; } and a method: [System.Runtime.InteropServices.DllImport(@"xyzdll.dll", CallingConvention = CallingConvention.Cdecl)] public static extern Int32 func(Int32 handle, ref Int32 numntries, [MarshalAs(UnmanagedType.LPArray)] XYZ[] arr); Then I try to call the function : XYZ xyz = new XYZ[numEntries]; for (...) xyz[i] = new XYZ(); func(handle,numEntries,xyz); Of course it does not work. Can someone shed light on what I am doing wrong ?

    Read the article

  • How to fix this simple SQL query?

    - by morpheous
    I have a database with three tables: user_table country_table city_table I want to write ANSI SQL which will allow me to fetch all the user data (i.e. user details including the name of the country of the last school and the name of the city they live in now). The problem I am having is that I have to use a self join, and I am getting slightly confused. The schema is shown below: CREATE TABLE user_table (id int, first_name varchar(16), last_school_country_id int, city_id int); CREATE TABLE country_table (id int, name varchar(32)); CREATE TABLE city_table (id int, country_id int, name varchar(32)); This is the query I have come up with so far, but the results are wrong, and sometimes, the db engine (mySQL), asks me if I want to show all [HUGE NUMBER HERE] results - which makes me suspect that I am unintentionally creating a cartesian product somewhere. Can someone explain what is wrong with this SQL statement, and what I need to do to fix it? SELECT usr.id AS id, usr.first_name, ctry1.name as loc_country_name, ctry2.name as school_country_name, city.name as loc_city_name FROM user_table usr, country_table ctry1, country_table ctry2, city_table city WHERE usr.last_school_country_id=ctry2.id AND usr.city_id=city.id AND city.country_id=ctry1.id AND ctry1.id=ctry2.id;

    Read the article

  • Graphics artifacts/distortion with Win7 and nVidia

    - by Gepard
    The problem I encounter is rather hard to describe, so I've provided a screenshot: http://dl.dropbox.com/u/1732760/video-distortion.png As you can see, there are some horizontal stripes in random colors. These stripes sometimes appear in all windowed apps, in games and on the desktop too. They tend to stay in place until I refresh the window (or force it to redraw by, for example, minimizing and maximizing it again). They also tend to appear in the same place and shape multiple times; even if they disappear, it's very likely they will show up again in the same place after a while. These artifacts do not blink or change if the computer is idling. If I don't touch anything and don't use the mouse, they will stay in place forever (unless some app redraws its window on its own). I first encountered this problem some weeks ago. Back then I thought it might be a cooling problem, so I took out the graphics card, removed the dust from the radiator and fan, and put it back into the PC. I also ran a stress test using FurMark (peak temperature was ~65C) to see if the problem becomes more intense when the card gets hotter, but surprisingly no artifacts whatsoever appear during the stress test. The graphics card is a Galaxy GeForce 7300 GT with DDR3 memory and was never overclocked. Drivers are the latest, from the Nvidia site. The OS is Windows 7 64-bit, fully updated. AMD64 3000+, 2GB RAM. I'm running a dual-monitor setup with two 19'' Samsung LCDs and the problem is on both, so I assume it's not a monitor or cable issue.

    Read the article

  • Despeckle line art

    - by Dour High Arch
    We have a number of line-art charts unfortunately saved as JPEGs. They are now riddled with distracting compression artifacts or "speckles". Is there any way of removing these? I do not have the original files and it would be very difficult to recreate them. I am running Windows 7 and tried Paint.NET; none of its filters help. Posterize washes out all the colors and leaves the speckles. Blur makes the text unreadable. Noise Reduction wrecks the antialiasing of curved lines and perversely enhances the speckles, making them look like checkerboards. Yes, I have Googled for software to do this; there are many programs that advertise despeckling, but after my experience with Paint.NET I do not want to experiment with applications that show no before-and-after images. The only example I have seen that does what I want is from a Photoshop tutorial. I have dozens of files, and the tutorial requires considerable manual fine-tuning. I would prefer to automate or batch-process this task. Commercial apps are fine, but I do not want to spend over $600 and learn a complex program for a single task.

    Read the article

  • Is there any rough estimate of sqlite3's open time?

    - by sxingfeng
    I am using SQLite3 from C++, and I have found that the time to open the database seems unstable the first time (I mean, after starting Windows and opening the db for the first time). It takes a long time on a 50 MB db, about 10 s on Windows, and it varies from run to run. Has anyone met the same problem? I am writing a desktop application for Windows, so the opening speed is really important to me. Thanks in advance!

        int nRet;
        #if defined(_UNICODE) || defined(UNICODE)
            nRet = sqlite3_open16(szFile, &mpDB); // not tested under window 98
        #else // For Ansi Version
            //*************- Added by Begemot szFile must be in unicode- 23/03/06 11:04 - ****
            OSVERSIONINFOEX osvi;
            ZeroMemory(&osvi, sizeof(OSVERSIONINFOEX));
            osvi.dwOSVersionInfoSize = sizeof(OSVERSIONINFOEX);
            GetVersionEx ((OSVERSIONINFO *) &osvi);
            if ( osvi.dwMajorVersion == 5)
            {
                WCHAR pMultiByteStr[MAX_PATH+1];
                MultiByteToWideChar( CP_ACP, 0, szFile, _tcslen(szFile)+1, pMultiByteStr,
                                     sizeof(pMultiByteStr)/sizeof(pMultiByteStr[0]) );
                nRet = sqlite3_open16(pMultiByteStr, &mpDB);
            }
            else
                nRet = sqlite3_open(szFile, &mpDB);
        #endif
        //*************************
        if (nRet != SQLITE_OK)
        {
            LPCTSTR szError = (LPCTSTR) _sqlite3_errmsg(mpDB);
            throw CppSQLite3Exception(nRet, (LPTSTR)szError, DONT_DELETE_MSG);
        }
        setBusyTimeout(mnBusyTimeoutMs);

    Read the article

  • Linear Interpolation. How to implement this algorithm in C? (Python version is given)

    - by psihodelia
    There exists one very good linear interpolation method. It performs linear interpolation requiring at most one multiply per output sample. I found its description in the third edition of Understanding DSP by Lyons. The method involves a special hold buffer: given a number of samples to be inserted between any two input samples, it produces output points using linear interpolation. Here, I have rewritten this algorithm in Python:

        temp1, temp2 = 0, 0
        iL = 1.0 / L
        for i in x:
            hold = [i-temp1] * L
            temp1 = i
            for j in hold:
                temp2 += j
                y.append(temp2 * iL)

    where x contains the input samples, L is the number of points to be inserted, and y will contain the output samples. My question is how to implement this algorithm in ANSI C in the most efficient way, e.g. is it possible to avoid the second loop? NOTE: the presented Python code is just to show how the algorithm works. UPDATE: here is an example of how it works in Python:

        from math import sin, pi   # needed for the sample input below

        x = []
        y = []
        hold = []
        num_points = 20
        points_inbetween = 2

        temp1, temp2 = 0, 0

        for i in range(num_points):
            x.append( sin(i*2.0*pi * 0.1) )

        L = points_inbetween
        iL = 1.0 / L
        for i in x:
            hold = [i-temp1] * L
            temp1 = i
            for j in hold:
                temp2 += j
                y.append(temp2 * iL)

    Let's say x=[.... 10, 20, 30 ....]. Then, if L=1, it will produce [... 10, 15, 20, 25, 30 ...]
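
    For reference, a direct ANSI C translation of the Python above could look like the sketch below (the function name lininterp and its signature are only for illustration; assumptions: x holds nx input samples, y has room for nx*L outputs, L >= 1). The inner loop still runs L times because it emits one output sample per pass, but the hold list is never materialised: each output costs one addition plus one multiply by the precomputed 1/L, matching the "one multiply per output sample" claim:

        #include <stddef.h>

        void lininterp(const double *x, size_t nx, double *y, int L)
        {
            double prev = 0.0;               /* temp1: previous input sample      */
            double acc  = 0.0;               /* temp2: running sum (integrator)   */
            double iL   = 1.0 / (double)L;   /* one multiply per output, as above */
            size_t i;
            int j;

            for (i = 0; i < nx; i++) {
                double step = x[i] - prev;   /* the value the Python code holds L times */
                prev = x[i];
                for (j = 0; j < L; j++) {
                    acc += step;
                    y[i * (size_t)L + (size_t)j] = acc * iL;
                }
            }
        }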

    Read the article

  • Calling base class constructor

    - by The Void
    In the program below, is the line Derived(double y): Base(), y_(y) correct/allowed? That is, does it follow ANSI rules? #include <iostream> class Base { public: Base(): x_(0) { std::cout << "Base default constructor called" << std::endl; } Base(int x): x_(x) { std::cout << "Base constructor called with x = " << x << std::endl; } void display() const { std::cout << x_ << std::endl; } protected: int x_; }; class Derived: public Base { public: Derived(): Base(1), y_(1.2) { std::cout << "Derived default constructor called" << std::endl; } Derived(double y): Base(), y_(y) { std::cout << "Derived constructor called with y = " << y << std::endl; } void display() const { std::cout << Base::x_ << ", " << y_ << std::endl; } private: double y_; }; int main() { Base b1; b1.display(); Derived d1; d1.display(); std::cout << std::endl; Base b2(-9); b2.display(); Derived d2(-8.7); d2.display(); return 0; }

    Read the article

< Previous Page | 63 64 65 66 67 68 69 70 71 72 73 74  | Next Page >