Search Results

Search found 15939 results on 638 pages for 'low memory'.


  • Feeding the kernel's entropy source from other machines and/or increasing its maximum size

    - by David Spillett
    We have had a little trouble with a small box that acts as a VPN end-point and mail relay for our network, caused by the available entropy for /dev/random being too low (which causes TLS connection attempts by exim to fail). The machine doesn't do anything else, so the normal feed into the entropy pool (interrupt timings from things like disk access) is not enough. As a quick hack I've set up a looping script that reads from /dev/hda at a couple of Mbyte/sec, which keeps it topped up. Other than buying a hardware RNG, is there a clean way of piping data for entropy from elsewhere, such as a copy of the data our file server uses for its entropy source? I've spotted several tips for using rng-tools to feed it from /dev/urandom on the same machine, but that "feels dirty". Also, is it possible to increase the maximum pool size? It currently seems to max out at 3585.
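
    One way to pipe in entropy from elsewhere is to credit the bytes to the kernel pool yourself. Below is a minimal sketch, assuming the data arrives on stdin (for example shipped over ssh from the file server): writing to /dev/random only mixes data in, but the RNDADDENTROPY ioctl also credits it, which is what raises the available-entropy count. The one-bit-per-byte credit is an arbitrary, conservative figure chosen purely for illustration.

        // Minimal sketch (run as root): read bytes from stdin and credit them
        // to the kernel entropy pool with RNDADDENTROPY.
        #include <fcntl.h>
        #include <unistd.h>
        #include <sys/ioctl.h>
        #include <linux/random.h>
        #include <cstdio>
        #include <vector>

        int main() {
            int rnd = open("/dev/random", O_WRONLY);
            if (rnd < 0) { std::perror("open /dev/random"); return 1; }

            // rand_pool_info is a small header (credit in bits, buffer size in
            // bytes) immediately followed by the data buffer itself.
            std::vector<unsigned char> storage(sizeof(rand_pool_info) + 512);
            rand_pool_info* info = reinterpret_cast<rand_pool_info*>(&storage[0]);

            ssize_t n;
            while ((n = read(STDIN_FILENO, info->buf, 512)) > 0) {
                info->buf_size = static_cast<int>(n);       // bytes supplied
                info->entropy_count = static_cast<int>(n);  // bits credited: ~1 per byte
                if (ioctl(rnd, RNDADDENTROPY, info) < 0) {
                    std::perror("RNDADDENTROPY");
                    return 1;
                }
            }
            close(rnd);
            return 0;
        }

    For comparison, rngd from rng-tools uses the same ioctl internally, so pointing it (or a similar feeder) at data shipped in from another host is usually cleaner than a hand-rolled loop.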

    Read the article

  • iPhone Development Profile Expired

    - by theiphoneguy
    I really combed this site and others. I read and re-read the related links here and the Apple docs. I'm sorry, but either I am obviously missing something right under my nose, or this Apple profile/certificate stuff is a bit convoluted. Here it is: I have a product in the App Store. I have updated it several times and users like it. My development profile recently expired just when I was improving the app for its next release. I can run the app in the simulator. I can compile and put the distribution build on my iPhone just fine. I went to the Apple portal and renewed the development profile. I downloaded it and installed it in Xcode. I see it in the Organizer window. I see it on my iPhone. I CANNOT put the debug build on my iPhone to debug or run with Instruments. The message is that either there is not a valid signed profile or it is untrusted. I subsequently tried to download and install the certificate to my Mac's keychain. Still no success. I checked the code signing section of the Project settings and also for the target and the root. All appears to indicate that it is using the expected development profile for debug. Yes, I had deleted the old profile from my iPhone, from the Organizer. I cleaned the Xcode cache and all targets. I have done all of this several times and in varying sequences to try to cover every possibility. I am ready to do anything to be able to debug with Instruments in order to check for leaks or high memory usage. Even though the distribution compile runs fine on my iPhone and plays well with other running processes, I will not release anything without a leaks/memory test. Any ideas will be appreciated. If I missed something obvious, please forgive me - it was not due to just posting a question without searching for similar postings. Thanks!

    Read the article

  • Where are the function literals in C++?

    - by academicRobot
    First of all, maybe literals is not the right term for this concept, but it's the closest I could think of (not literals in the sense of functions as first class citizens). The idea is that when you make a conventional function call, it compiles to something like this: callq <immediate address> But if you make a function call using a function pointer, it compiles to something like this: mov <memory location>,%rax callq *%rax Which is all well and good. However, what if I'm writing a template library that requires a callback of some sort with a specified argument list, and the user of the library is expected to know what function they want to call at compile time? Then I would like to write my template to accept a function literal as a template parameter. So, similar to template <int int_literal> struct my_template {...}; I'd like to write template <func_literal_t func_literal> struct my_template {...}; and have calls to func_literal within my_template compile to callq <immediate address>. Is there a facility in C++ for this, or a workaround to achieve the same effect? If not, why not (e.g. some cataclysmic side effects)? How about C++0x or another language? Solutions that are not portable are fine. Solutions that include the use of member function pointers would be ideal. I'm not particularly interested in being told "You are a <socially unacceptable term for a person of low IQ>, just use function pointers/functors." This is a curiosity based question, and it seems that it might be useful in some (albeit limited) applications. It seems like this should be possible since function names are just placeholders for a (relative) memory address, so why not allow more liberal use (e.g. aliasing) of this placeholder. p.s. I use function pointers and function objects all the time and they are great. But this post got me thinking about the don't pay for what you don't use principle in relation to function calls, and it seems like forcing the use of function pointers or a similar facility when the function is known at compile time is a violation of this principle, though a small one.
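
    For reference, a function's address can itself be a non-type template parameter, which is about as close to a "function literal" as the language gets; the names below (callback, my_template) are purely illustrative. Whether the compiler actually emits a direct callq, or inlines the call away entirely, is a quality-of-implementation matter, but with optimization enabled that is the usual result.

        #include <cstdio>

        void callback(int x) { std::printf("called with %d\n", x); }

        // The callee is baked into the type at compile time, so the compiler
        // can emit a direct call (or inline it) instead of an indirect call
        // through a stored pointer.
        template <void (*Func)(int)>
        struct my_template {
            void run(int v) const { Func(v); }
        };

        int main() {
            my_template<&callback> t;   // &callback is a compile-time constant
            t.run(42);
            return 0;
        }

    Pointers to member functions are accepted as non-type template parameters in the same way, e.g. template <void (SomeClass::*Func)(int)>.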

    Read the article

  • How to make VIM settings computer-dependent in .vimrc?

    - by Paperflyer
    I share my VIM configuration file between several computers. However, I want some settings to be specific to certain computers. For example, font sizes on the high-res laptop should be different from those on the low-res desktop. And more importantly, I want gVIM on Windows to behave more windowsy, MacVim on OSX to behave more maccy, and gVIM on Linux to just behave like it always does. (That might be a strange sentiment, but I am very used to switching mental modes when switching OSes.) Is there a way to make a few settings in the .vimrc machine- or OS-dependent?

    Read the article

  • Stuck at being unable to print a substring of more than 4679 characters

    - by Newcoder
    I have a program that does string manipulation on very large strings (around 100K). The first step in my program is to clean up the input string so that it only contains certain characters. Here is my method for this cleanup: public static String analyzeString (String input) { String output = null; output = input.replaceAll("[-+.^:,]",""); output = output.replaceAll("(\\r|\\n)", ""); output = output.toUpperCase(); output = output.replaceAll("[^XYZ]", ""); return output; } When I print my 'input' string of length 97498, it prints successfully. My output string after cleanup is of length 94788. I can print its size using output.length(), but when I try to print it in Eclipse, the output is empty and all I can see is the Eclipse output console header. Since this is not my final program, I ignored this and proceeded to the next method, which does pattern matching on this cleaned-up string. Here is the code for the pattern matching: public static List<Integer> getIntervals(String input, String regex) { List<Integer> output = new ArrayList<Integer> (); // Do pattern matching Pattern p1 = Pattern.compile(regex); Matcher m1 = p1.matcher(input); // If match found while (m1.find()) { output.add(m1.start()); output.add(m1.end()); } return output; } Based on this program, I identify the start and end intervals of my pattern match as 12351 and 87314. I tried to print this match as output.substring(12351, 87314) and only get blank output. Numerous trial-and-error runs led to the conclusion that the biggest substring I can print is of length 4679. If I try 4680, I again get blank output. My confusion is: if I was able to print the original string (length 97498), why couldn't I print the cleaned-up string (length 94788) or a substring longer than 4679 characters? Is it due to the regular expression implementation causing some memory issue that my system is not able to handle? I have 4 GB of installed memory.

    Read the article

  • Struct Array Initialization and String Literals

    - by Christian Ammer
    Is the following array initialization correct? I guess it is, but I'm not really sure whether I can use const char* or whether I'd better use std::string. Besides the first question, do the char pointers point to memory segments of the same size? struct qinfo { const char* name; int nr; }; qinfo queues[] = { {"QALARM", 1}, {"QTESTLONGNAME", 2}, {"QTEST2", 3}, {"QIEC", 4} };
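
    As a quick sanity check, here is a small self-contained sketch (the loop and output are purely illustrative) showing that the aggregate initialization above is valid and that each name pointer refers to a string literal of its own length, so the pointed-to memory blocks are generally not the same size:

        #include <cstddef>
        #include <cstdio>
        #include <cstring>

        struct qinfo {
            const char* name;   // points at a string literal with static storage duration
            int nr;
        };

        // Aggregate initialization; each literal is stored as its own array of
        // characters plus a terminating '\0', so the blocks differ in size.
        qinfo queues[] = {
            {"QALARM", 1}, {"QTESTLONGNAME", 2}, {"QTEST2", 3}, {"QIEC", 4}
        };

        int main() {
            const std::size_t count = sizeof(queues) / sizeof(queues[0]);
            for (std::size_t i = 0; i < count; ++i) {
                std::printf("%-14s nr=%d length=%zu\n",
                            queues[i].name, queues[i].nr, std::strlen(queues[i].name));
            }
            return 0;
        }

    If the strings should own their storage, swapping const char* for std::string keeps the same brace initializer working, since the struct remains an aggregate.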

    Read the article

  • Heroku in real life apps

    - by Victor P
    What is your experience using Ruby on Rails on Heroku in production mode? Apart from the issue of the expensive HTTPS, do you see any drawback in the way it manages processes, memory and storage? The people at Heroku are quite nice and I'm sure they are willing to answer my questions, but I would like some opinions from the customer side.

    Read the article

  • Enable System Beep in Ubuntu

    - by Melissa W
    I have tried and tried to get the system beep working, but with no success. I have selected System--Sound--System Beep--Enable Audible Beep (from the Gnome Desktop). I have tried, from a Terminal window, Edit--General Tab--selecting the terminal bell checkbox. I have tried entering modprobe pcspkr at the command line. Trying echo -e '\a' or using the beep application - nothing works! I know my hardware speaker works, because if the battery is low on startup it will beep. Update: It is a laptop computer, an IBM Thinkpad iSeries. I did look at the modprobe blacklist, and pcspkr was not listed.
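
    One way to check whether the pcspkr path works at all, independently of GNOME and the terminal bell, is to drive the speaker through the console ioctl that the beep utility itself uses. A rough sketch, assuming a Linux console, root (or ownership of /dev/console), and an arbitrary test tone:

        #include <fcntl.h>
        #include <unistd.h>
        #include <sys/ioctl.h>
        #include <linux/kd.h>
        #include <cstdio>

        int main() {
            // KIOCSOUND takes the PIT divisor: clock (1193180 Hz) / desired frequency.
            int fd = open("/dev/console", O_WRONLY);
            if (fd < 0) { std::perror("open /dev/console"); return 1; }

            const int freq_hz = 750;                     // arbitrary test tone
            ioctl(fd, KIOCSOUND, 1193180 / freq_hz);     // start the tone
            usleep(200 * 1000);                          // let it ring for 200 ms
            ioctl(fd, KIOCSOUND, 0);                     // silence
            close(fd);
            return 0;
        }

    If this stays silent with pcspkr loaded, the problem is below the desktop settings (module or hardware routing); if it beeps, the issue is in the GNOME / terminal bell configuration.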

    Read the article

  • SAS disk performance drops a while after reboot.

    - by Flamewires
    So we have some workstations with identical hardware. The Fedora 14 box has a couple of weeks of uptime and still gets good performance: hdparm -tT /dev/sda /dev/sda: Timing cached reads: 21766 MB in 2.00 seconds = 10902.12 MB/sec Timing buffered disk reads: 586 MB in 3.00 seconds = 195.20 MB/sec The CentOS 5.5 boxes, however, seem to be okay after a reboot: /dev/sda: Timing cached reads: 34636 MB in 2.00 seconds = 17354.64 MB/sec Timing buffered disk reads: 498 MB in 3.01 seconds = 165.62 MB/sec but some time later (unsure exactly when; tested at approximately 1 day of uptime) they drop to this: /dev/sda: Timing cached reads: 2132 MB in 2.00 seconds = 1064.96 MB/sec Timing buffered disk reads: 160 MB in 3.01 seconds = 53.16 MB/sec This is with very low load. I believe they all have the same BIOS settings. Any ideas what could cause this on CentOS? Ask for more info if needed. It might also be worth noting that passing the --direct flag causes the slow boxes to perform similarly to the non-slow ones for buffered disk reads.

    Read the article

  • Tools to automate recording streaming radio

    - by Stan
    Are there any tools that can automate recording online streaming radio? I've been using Total Recorder, which has some useful features: a handy scheduler, and support for recording templates, so I can customize high/low quality recordings. Unfortunately it requires opening the streaming radio in a browser and can't have another sound source at the same time; it records whatever comes out of the speaker. What I am looking for: given an online radio URL, the tool should be able to record the audio stream, no matter whether I am playing any other music or not. Does such a tool exist?

    Read the article

  • DUMP in unhandled C++ exception

    - by Jorge Vasquez
    In MSVC, how can I make any unhandled C++ exception (std::runtime_error, for instance) crash my release-compiled program so that it generates a dump with the full stack from the exception throw location? I have installed NTSD in the AeDebug registry and can generate good dumps for things like memory access violations, so the matter here comes down to crashing the program correctly, I suppose. Thanks in advance.
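
    One approach worth trying, offered only as a sketch under stated assumptions rather than a verified recipe: install a std::terminate handler that immediately faults, reproducing the memory-access-violation case the existing AeDebug/NTSD setup already turns into good dumps. Whether the stack is unwound before terminate() runs for an uncaught exception is implementation-defined; MSVC is generally reported not to unwind in that case, so the throw-site frames should still be on the stack, but verify this on your toolchain.

        #include <exception>
        #include <stdexcept>

        // Handler never returns: it forces an access violation so the
        // postmortem debugger registered under AeDebug takes over.
        void crash_on_terminate() {
            volatile int* p = 0;
            *p = 0;                                   // deliberate access violation
        }

        int main() {
            std::set_terminate(crash_on_terminate);   // install once, early in startup
            throw std::runtime_error("unhandled");    // no catch anywhere above
        }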

    Read the article

  • How to use 3TB SAS HDDs which aren't supported by the Dell PERC 6i

    - by Sunry
    We bought several 3TB SAS hard disks that we want to use with our Dell R710 server. But we found our R710 can't recognize the 3TB HDs, and then we dug out that the RAID card is a PERC 6i, which doesn't support them. Dell told us that the H700 and H800 support 3TB HDs, so we thought we could buy an H700, but the SAS cable is not available on the market right now. And it seems very hard for us to change the RAID card; we also have no suitable tools to remove the special screws. So I want to know whether there is any other method to use these 3TB HDs instead of putting them in the warehouse. Can I low-level format them to 2TB and plug them into the Dell R710 to use? Or use a SAS-to-USB adapter to use them as USB external disks? I just want to use them with the Dell R710, even externally.

    Read the article

  • Project Performance Evaluation and Finding Weak Areas

    - by pramodc84
    I'm working on a J2EE web project, which has lots of Java, SQL scripts, JS, and AJAX stuff. The project has been running fine for 5 years. I have been assigned the work of performance evaluation on the project, as there might be some memory usage issues, delays in the DB fetching logic, and other similar weak performance areas. Where should I begin? Any best practices to make the project better?

    Read the article

  • Can ASP.NET routing (3.5 SP1) cause this issue?

    - by Fredou
    I have a folder (/MyFolder/) with a dedicated web.config in it that does impersonation. In that folder I have an ASP.NET file named MyReport.aspx that uses Microsoft Report Viewer 8.0. When I view this folder on my machine, it works perfectly without issue. When I publish my project to the dev server and try to view the report, I get an issue where the user that runs IIS doesn't have access to something (rsAccessDenied). Can ASP.NET routing cause this issue? (I'm not at work right now, so I can only go by memory; it will be hard to provide more information.)

    Read the article

  • I would like to have a publicly accessible Linux box hosted elsewhere. Who provides this service?

    - by Eric Wilson
    I would like to have a general purpose Linux server available and publicly accessible. I understand that there is no lack of web-hosting companies, but I might want more control over the machine than is typical. I would want the ability to install software, such as an SVN server, and I would like to be able to expose various port numbers, as I may have a variety of extremely low traffic sites that I would want to have available. Obviously, one option is to host such a machine in my home. Is that my only option? Or is what I describe available out there, perhaps as a virtual machine on a larger server?

    Read the article

  • Increase the compression performance of VPN

    - by Martin
    I am currently switching from a system with HPN-SSH tunnels and enabled compression to something VPN based. I have tried tinc and n2n so far; hamachi requires a library I do not have. In my primitive benchmarks I am not satisfied with the achievable bandwidth compared to the SSH tunnels. In tinc the low LZO setting performed best, but compression is only available in UDP mode. Ideally I would like to have a TCP-based VPN with multi-threaded compression. Can you suggest some ideas on how to increase the performance? Would it be possible to somehow put a compression filter in front of the tun interface? Or are there any VPN implementations that might be better suited to my needs (fast compression, TCP-based, switch mode, does not have to be super-secure)? I would consider tunnelling Ethernet over SSH, but according to some articles it is not advisable.
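
    On the "compression filter in front of the tun interface" idea: it can be done in user space, at the cost of an extra copy per packet. Below is a rough C++ sketch, not a working VPN, that reads raw packets from an already-configured tun device and compresses each with zlib before it would be handed to whatever transport carries it. The device name, buffer sizes and compression level are arbitrary assumptions; build with -lz.

        #include <fcntl.h>
        #include <unistd.h>
        #include <sys/ioctl.h>
        #include <net/if.h>
        #include <linux/if_tun.h>
        #include <zlib.h>
        #include <cstdio>
        #include <cstring>
        #include <vector>

        static int open_tun(const char* name) {
            int fd = open("/dev/net/tun", O_RDWR);
            if (fd < 0) return -1;
            ifreq ifr;
            std::memset(&ifr, 0, sizeof(ifr));
            ifr.ifr_flags = IFF_TUN | IFF_NO_PI;          // raw IP packets, no extra header
            std::strncpy(ifr.ifr_name, name, IFNAMSIZ - 1);
            if (ioctl(fd, TUNSETIFF, &ifr) < 0) { close(fd); return -1; }
            return fd;
        }

        int main() {
            int fd = open_tun("tun0");                    // interface must be set up separately
            if (fd < 0) { std::perror("tun"); return 1; }

            std::vector<unsigned char> pkt(65536);
            std::vector<unsigned char> out(compressBound(65536));
            ssize_t n;
            while ((n = read(fd, &pkt[0], pkt.size())) > 0) {
                uLongf out_len = out.size();
                // Level 1 keeps CPU cost low, roughly in the spirit of tinc's LZO setting.
                if (compress2(&out[0], &out_len, &pkt[0], static_cast<uLong>(n), 1) == Z_OK) {
                    // ... hand out_len bytes to the TCP/UDP transport here ...
                    std::printf("packet: %zd bytes -> %lu compressed\n",
                                n, static_cast<unsigned long>(out_len));
                }
            }
            close(fd);
            return 0;
        }

    A multi-threaded variant would hand packets from the read loop to a pool of worker threads, since zlib itself compresses each buffer single-threaded.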

    Read the article

  • Text template or tool for documentation of computer configurations

    - by mjustin
    I regularly write and update technical documentation which will be used to set up a new virtual machine, or to look up system dependencies in networks with around 20-50 (server-side) computers. At the moment I use OpenOffice Writer with text tables, and create one document per intranet domain. To improve this documentation, I would like to collect some examples to identify areas where my documents can be improved, regarding general structure and content, to make them easy to read and use not only for me but also for technical staff, helpdesk, etc. Are there simple text templates (for example for OpenOffice Writer) or tools (maybe database-driven) for structured documentation of a computer configuration? Such a template / tool should provide required and optional configuration sections, like 'operating system', 'installed services', 'mapped network drives', 'scheduled tasks', 'remote servers', 'logon user account', 'firewall settings', 'hard disk size' ... These documents are not so much about low-level hardware as about infrastructure / integration information (no BIOS settings or MAC addresses).

    Read the article

  • What does it mean to be "terminated by a zero"?

    - by numerical25
    I am getting into C/C++ and a lot of unfamiliar terms are popping up. One of them is a variable or pointer that is "terminated by a zero". What does it mean for a space in memory to be terminated by a zero? I am not sure if I am saying it correctly; if not, then please correct me. Thanks!
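
    For what it's worth, the phrase usually refers to C-style strings: a sequence of characters whose end is marked by a byte holding the value 0 (the null terminator) rather than by a stored length. A small illustrative example:

        #include <cstdio>
        #include <cstring>

        int main() {
            // The literal below occupies 6 bytes: 'h','e','l','l','o' and a final 0 byte.
            // That trailing zero is how functions such as strlen() know where the
            // string ends, since the pointer alone carries no length information.
            const char text[] = "hello";

            std::printf("characters: %zu\n", std::strlen(text)); // 5 - stops at the 0
            std::printf("bytes:      %zu\n", sizeof(text));      // 6 - includes the 0
            return 0;
        }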

    Read the article

  • How to reinstall OS X 10.8 without internet?

    - by Sam
    I had a crash on my MacBook Air MD231 (Mid 2012) and my system won't boot, and unfortunately my friend erased my partition from Disk Utility. Now I have three questions: 1) My internet speed is very low; for re-installing, do I have to download all of OS X 10.8 from the internet, or can I download it and install it via USB (how??)? 2) Do I have to select the "Re-install OS X" option, and how big is that download (for example 4.2 GB)? 3) After installing, is there any way to restore my data? Unfortunately I don't have any backup right now, and after I turn my Mac on, it automatically goes into repair mode... Thanks for your kindness.

    Read the article

  • In Varnish, is it normal for the number of freed bytes to be 60% of those allocated?

    - by user331397
    I have an installation of Varnish 3.02 on an Amazon EC2 Medium Linux instance in front of two relatively low-traffic websites. After an uptime of 2 hours, there are 3400 objects in the cache. Using varnishstat, I checked the variables SMA.s0.c_bytes and SMA.s0.c_freed, which I assume correspond to the total number of bytes allocated since startup and the number freed, respectively. No objects should have had time to expire during these two hours, but still about 60% of the memory allocated since startup (330MB out of 560MB) has already been freed. Do you know if this is normal? If not, do you know what kind of configuration could be wrong?

    Read the article

  • What is the standard way to deploy memcached?

    - by Tom
    I have two IIS servers and two MySQL servers. On all servers I have some available memory (more than a GB). What is the standard way to deploy memcached? Should I add a new server especially for memcached, or should I use my existing servers? If so, which servers? Thanks in advance.

    Read the article

  • Microsoft CALs for Domain Controller

    - by Damo
    I am designing a network and I've come to the point of specifying the number of CALs required for this network. Microsoft licensing has always confused me; it's just not always clear to me. I plan to have one 2008 Std domain controller, another 2008 server (not a domain controller), and 200 Windows 7 devices connected to the domain for domain services. The 200 W7 devices will all authenticate to the domain controller with the same domain account. (This is a special type of network, not a user workstation network.) Therefore, do I need to purchase 200 CALs for the 200 devices, or can I purchase, say, 10 CALs (user CALs), as the number of unique user accounts is very low? Many thanks for looking.

    Read the article

  • Ubuntu/Linux version recommendation for HP dv6 3122TX?

    - by sanjayav
    I purchased an HP dv6 3122TX recently, and after installing Ubuntu 10.10 64-bit I ran into multiple issues: the wireless driver is not supported by Ubuntu [the driver is a RaLink RT3090]; the ethernet stopped working sometimes for no reason [the driver is a Realtek RTL8111/8168B]; and a "Corrupted low memory at ..." issue, which is described as a kernel bug in the Ubuntu support forums (it started taking me to a terminal instead of the GUI, and I couldn't start the X server after that). As I'm not an expert Ubuntu user I got fed up with all these issues and went back to Windows 7. But I need an Ubuntu installation up and running for my development work. What are your suggestions for a reasonable Ubuntu version that I should try? Or a different Linux distribution? Should I stick to a 32-bit version? It'd be great if anyone could give some advice on this.

    Read the article

  • Does the size of the monitor matter?

    - by Arsheep
    I have an old computer, and I want to buy a big LCD. The best I can find is a ViewSonic 24" LCD TFT monitor. So will it run without any problems, or do I need to upgrade the video card or something too? The computer is not that old: it has a P4 board and a Celeron processor with 128 MB of graphics memory. And in Properties it shows the maximum resolution I can use is 1280 x 1024. I am a noob hardware-wise, so I need help with this stuff. Thanks.

    Read the article
