In C, I would like to limit a string to its first 8 characters when printing. For example, I have:
int out = printf("%s", str);
How can I make it so it only prints the first 8 characters?
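For reference, a minimal sketch of the sort of thing I mean (my own example; the string literal is made up), using printf's precision specifier:

#include <stdio.h>

int main(void) {
    const char *str = "a string longer than eight chars";
    printf("%.8s\n", str);  /* the .8 precision caps output at 8 characters */
    return 0;
}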
I need a regular expression which accepts all types of characters (letters, numbers, and all special characters); the minimum number of characters should be 15, with no limit on the maximum.
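For illustration, a sketch of the kind of pattern I have in mind (assuming a PCRE-style engine; [\s\S] is used instead of . so that newlines also count as characters):

^[\s\S]{15,}$

Here {15,} enforces at least 15 characters with no upper bound.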
Why can't I construct large tuples in Haskell? Why is there a tuple size limit?
Prelude> (1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1)
<interactive>:1:0:
    No instance for (Show (t, t1, t2, ..., t23))
      arising from a use of `print' at <interactive>:1:0-48
    Possible fix:
      add an instance declaration for (Show (t, t1, t2, ..., t23))
    In a stmt of a 'do' expression: print it
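For what it's worth, a quick check I sketched myself (binding the tuple without printing it), which suggests construction succeeds and only the Show instance is missing:

Prelude> let t = (1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1)
Prelude> -- no error here; only trying to print t triggers the missing instance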
I created a Java LinkedBlockingQueue with new LinkedBlockingQueue(1) to limit the size of the queue to 1. However, in my testing this seems to be ignored, and there are often several items in the queue at any given time. Why is this?
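For context, a minimal sketch of the behavior I expected (class and variable names are mine):

import java.util.concurrent.LinkedBlockingQueue;

public class QueueDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>(1); // capacity 1
        queue.put("first");                        // succeeds: queue was empty
        boolean accepted = queue.offer("second");  // expected false: capacity reached
        System.out.println(accepted);
    }
}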
In a simple CakePHP model where User hasMany Item (and Item belongsTo User):
Suppose there are 50 users, each with 10 items.
How can I find() only 5 users, each with the latest 5 items they have?
When I impose a limit in the find, it only limits the number of users, not the number of associated Items.
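For reference, a sketch of the kind of find I am attempting (my own attempt, using the Containable behavior; the order field is illustrative):

$users = $this->User->find('all', array(
    'limit'   => 5,  // this limits Users only
    'contain' => array(
        'Item' => array(
            'order' => 'Item.created DESC',  // hypothetical field
            'limit' => 5,                    // hoped-for per-user Item limit
        ),
    ),
));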
I know that each recipient counts toward the Recipients Emailed (billable) quota, and that I can't send to more recipients than my daily quota allows.
But is there a limit on how many recipients a single email can have?
For example, can a single email created and sent through the App Engine Mail API have 8,000 recipients?
I'm getting a "Limit exceeded" error when I submit my sitemap to Yahoo.
Can anyone explain what the error means?
Possible problems:
1. Too many submissions: we certainly don't submit the sitemap 5,000 times a day.
2. The sitemap is too big: our sitemap is a mere 40 KB file, gzipped.
We are currently using a DataSet to load and save our data to an XML file, and there is a good possibility that the file could get very large. We are wondering whether there is any limit on the size of an XML file, such that the DataSet would run into issues in the future because of it. Please advise.
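For context, a sketch of the load/save pattern we use (simplified by me; the dataset name and file name are made up):

using System.Data;

class Demo {
    static void Main() {
        var ds = new DataSet("AppData");
        ds.ReadXml("data.xml");    // loads the entire file into memory
        // ... work with ds.Tables ...
        ds.WriteXml("data.xml", XmlWriteMode.WriteSchema);
    }
}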
I need to run a background PHP script on the server, but it hogs resources and chokes the server. How do I limit its resources and allow other tasks to run smoothly alongside it?
Would incorporating usleep() into the script help?
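For reference, the way I am currently thinking of launching it (a sketch, assuming a Linux host where nice and ionice are available; the script path is a placeholder):

nice -n 19 ionice -c3 php /path/to/script.php &

Here nice -n 19 lowers the CPU priority and ionice -c3 puts its disk I/O in the idle class.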
I want to set up a number of guests with multiple CPUs (4) and at least 4 GB RAM, running Ubuntu Linux. These machines will mostly be idle, but from time to time their workload will require all their resources, especially the CPU.
The hosts are ESXi 5.x.
The question is, am I right in thinking that the resource consumption of the machines when idle will be negligible?
We know this is true of disk and CPU. The only concern left, therefore, is memory. Since ESX over-commits memory, it makes sense that unused memory of any guest is paged out.
Is my thinking correct?
We are having some issues with changing the maximum message size in Zimbra.
We followed the instructions given on the Zimbra wiki:
zimbra@ZIM1:~$ zmprov modifyConfig zimbraMtaMaxMessageSize 2048000
zimbra@ZIM1:~$ postfix reload
postfix/postfix-script: refreshing the Postfix mail system
zimbra@ZIM1:~$ postconf | grep message_size_limit
message_size_limit = 10240000
The strange thing: the setting displayed on the admin page does change, but not to the corresponding size (2000 KB).
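A check that might narrow this down (my own idea, assuming standard zmprov usage): compare the value zmprov has stored against what Postfix actually reports:

zimbra@ZIM1:~$ zmprov getConfig zimbraMtaMaxMessageSize
zimbra@ZIM1:~$ postconf message_size_limit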
It would be much appreciated if someone could help us.
I want my Windows to run as fast as possible. If I have 12 GB RAM in Windows 7 64-bit, a quad-core CPU, and all apps fit in memory, will the swap file ever be used for anything? The question is whether it's a good idea to put the swap file on a RAM disk.
Would a RAM disk help in any way, or will Windows intelligently use all the available memory for all its work?
I am also thinking of putting the temp folder on a RAM disk. I know a RAM disk is volatile memory, and I don't care about its contents if they get lost.
Running Win 7 64-bit SP1 with 8 GB RAM. I first noticed this problem when using the GUI to copy some large (5+ GB) files from one disk to another. What happens is the physical memory in use rises quite quickly to 100% and the system comes to a crawl. If I just start to access the file in a media player (it is a movie) the memory usage climbs up slowly but eventually reaches 100%.
When copying the same files via XCOPY I do not have this problem.
Using RAMMAP I see most of the memory usage is under "Mapped File" and is allocated under the "Active" column. If I select "Empty System Working Set" the RAM usage drops back down but then starts to climb back up.
Any ideas on what I can check/test to eliminate this issue?
I have an issue with a Dell Inspiron 15 (1545) laptop that refuses to open any applications (save select Microsoft programs, e.g. Security Essentials, Control Panel, Windows Explorer (not Internet Explorer), regedit, Event Viewer, etc.). I've run the Microsoft Memory Diagnostics Tool and it reported that "a hardware problem was detected". Does this indicate that the RAM has failed? I notice that when I open programs like Word, Excel, or Internet Explorer, I always get an error from WerFault.exe saying "The instruction at xxxxxxx referenced memory at xxxxxxxxx. The memory could not be written.", and sometimes something about illegal instructions.
If it is a hardware problem, does this mean that replacing the RAM is my only option? I would also like to know whether RAM can fail (like hard drives do) and whether malware can cause RAM to fail.
I have a Toshiba A100, which I upgraded to 4 GB of RAM. The hardware startup indeed shows 4 GB of RAM, and I recently installed Windows 7, just to see how it behaves. So far so good: it displays 4 GB of RAM. Not that I have tried to use it all, but it displays it. Previously, under XP, I would also see 4 GB of RAM.
But under Ubuntu 9.10 (32- or 64-bit), it only shows 2.9 GB of RAM. And my kernel is the "pae" build, which is supposed to work around the 32-bit addressing limitation.
How can I get Ubuntu to fully use my 4 GB of RAM?
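In case it helps, these are the checks I know how to run (standard commands; I have omitted the output):

uname -r                 # confirm the running kernel is the -pae build
free -m                  # total memory as the kernel sees it
dmesg | grep -i memory   # what the kernel detected at boot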
I use Mozilla Firefox 3.6 all day, opening and closing tabs quite regularly. I have noticed that the firefox.exe process size keeps growing over time.
Initially I put this down to memory fragmentation caused by opening and closing tabs, but now I suspect there is a memory leak in one of the add-ons I have installed.
The problem is that when the process size gets to about 1.5 GB in the "Mem Usage" column in Task Manager (and it gets there quite regularly), Firefox freezes up.
Does anyone have any ideas about how I could diagnose whether:
Any of the add-ons are leaking memory?
Something else is causing this problem?
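One diagnostic step I am considering (this assumes Firefox's standard command-line flags): launching with all add-ons disabled, to see whether the growth disappears:

firefox -safe-mode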
For instance, I have a video file which is 11.8 GB, but my RAM is only 2 GB.
How does VLC (or other software) handle it? How is the file loaded into memory?
I used the VMMap tool (from Sysinternals) to take a look at the memory, and I saw:
Private: ~160,000 K
Working Set: ~100,000 K
Obviously, that's much less than 11.8 GB, so how does that work?
This question is not only about video.
I'd like to know how a computer, in general, handles very large files.
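My current mental model, sketched in C (this is my assumption about the general technique, not any particular player's code): the software reads a small window of the file at a time, so only that buffer is ever resident:

#include <stdio.h>

int main(void) {
    FILE *f = fopen("video.mkv", "rb");  /* hypothetical file name */
    if (!f) return 1;
    char buf[64 * 1024];                 /* 64 KiB window */
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0) {
        /* decode/process this chunk, then move on; earlier
           chunks do not need to stay in memory */
    }
    fclose(f);
    return 0;
}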
So it's a well-known fact that PowerMacs manufactured before 2002 cannot take a hard drive larger than 128 GB. I have an old B&W that was running 10.4, and upon putting a 250 GB drive inside, it told me that I had inserted a 128 GB drive. That was expected. However, I recently decided to turn that machine into a Debian home file server. I shoved the 250 GB drive inside, did some formatting, and now it tells me that it is a 250 GB drive. Is this safe to use? Will all my data become corrupt after I've added more than 128 GB of stuff?
In case the specs are helpful to have, it's a 400MHz B&W, 1GB RAM, Rev. B.
I'm using FastCGI via IIS7 to host a PHP application. For whatever reason, downloads that are streamed via PHP (i.e. a script that outputs a file/bytes as a response) complete perfectly on high-speed connections, but on anything slower (even DSL) they quit at EXACTLY 4128760 bytes (~3.9 MB), which makes me think it's a configuration issue. We only started having this problem when we switched from Apache to IIS, which also points to a configuration problem, I think.
But if it's a configuration issue, why would it only affect slower connections? Does anyone know where (or how) I could change a setting like this? I've tried changing the idleTimeout, executionTimeout, and activityTimeout values in my web.config, but this hasn't helped at all. Any help or direction would be greatly appreciated.
I know this question has already been discussed, but I still don't understand something, so please help me clarify it.
As I understand it, there are two ways to do I/O, i.e. for the CPU to communicate with other hardware.
One is to use the IN and OUT instructions, and the second is memory-mapped I/O.
What I don't actually understand is this: when the IN and OUT instructions are used, you specify a source port. But what is this port? I mean, is it a different set of pins on the CPU, or what? And what is that port connected to?
And for memory-mapped I/O, I am missing just one tiny detail: does memory-mapped I/O first have to be set up with IN and OUT instructions, or does the device itself somehow connect to the RAM and read it?
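To make the first part concrete, here is the sort of port access I am asking about, sketched in C for Linux/x86 (this assumes <sys/io.h> and root privileges; 0x3F8 is the conventional COM1 base port):

#include <sys/io.h>

int main(void) {
    if (ioperm(0x3F8, 8, 1) != 0)           /* request access to the port range */
        return 1;
    outb('A', 0x3F8);                       /* OUT: write one byte to the port */
    unsigned char status = inb(0x3F8 + 5);  /* IN: read the line status register */
    (void)status;
    return 0;
}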
I have been googling this issue for a long time, and no complete solution seems to be out there.
For certain MKV files, Windows hangs when moving, copying, or deleting them. These files play fine in a player such as GOM Player.
System: a fast Windows 7, 64-bit box.
Results:
CPU change is negligible
Memory usage rockets up to ~100%
In the case of a delete or move, the "discovering files" dialogue box stays up for a long time
A rename simply shows a spinning icon until it finishes
The action completes after an EXTREMELY long amount of time
Memory usage does not return to normal
"Fixes:"
Disable thumbnail creation (helps for some cases)
After a move/rename/delete, kill Explorer with Task Manager and relaunch it to reclaim your memory
Even with thumbnails turned off, the issue persists. I have also tried re-muxing a file; the remux worked fine, but the resulting file had the same issues as above.
I added a new slow transport to my Postfix configuration, but it doesn't seem to work. Messages pass correctly through the slow transport, but they aren't rate limited.
Currently, I have this set up in my master.cf:
slow unix - - n - 1 smtp
-o default_destination_concurrency_limit=1
-o initial_destination_concurrency=1
-o smtp_destination_concurrency_limit=1
-o in_flow_delay=2s
-o syslog_name=slow
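For comparison, the variant I am considering next (an assumption on my part that a destination rate delay, rather than the concurrency limits above, is the knob that actually spaces out deliveries):

slow unix - - n - 1 smtp
    -o syslog_name=slow
    -o default_destination_rate_delay=2s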
Any idea why my messages aren't rate limited?
I have this problem on my workstation: the computer effectively freezes for 2-5 seconds for no apparent reason, then continues as normal. While frozen, the mouse is still movable, but only on one of the screens in my multi-screen setup. What could be the likely cause?
System:
CPU: i7-920
Memory: 12G of Patriot DDR3, 6 modules
OS: SLED 11, Suse Linux Enterprise Desktop, using Gnome
Main board: Asus P6T
Video: two Nvidia 9500GT connected to three displays
I am using the memory at the recommended settings of 8-8-8-1333; it has an XMP profile. The CPU is slightly overclocked to 3.3 GHz, but my cooling more than allows for it. I ran the computer with all overclocks off and a lower memory speed, but the issue was still there.
Any ideas? Where should I start looking?
I've achieved a lot in blocking attacks on game servers, but I'm stuck on something. I've blocked the major requests the game server accepts, which take the form "\xff\xff\xff\xff" followed by the actual query, like getstatus or getinfo, making something like "\xff\xff\xff\xff getstatus ". But I see that other queries sent to the game server cause it to reply with a "disconnect" packet at the same rate as the input, so if the input rate is high, the high output of "disconnect" packets might lag the server. Hence I want to block all queries except the ones actual clients use, which I suppose are in the form "\xff\xff\xff\xff" or .... So,
I tried using these rules:
-A INPUT -p udp -m udp -m u32 ! --u32 0x1c=0xffffffff -j ACCEPT
-A INPUT -p udp -m udp -m recent --set --name Total --rsource
-A INPUT -p udp -m udp -m recent --update --seconds 1 --hitcount 20 --name Total --rsource -j DROP
Now, the rules do accept the clients, but they only block requests of the form "\xff\xff\xff\xff getstatus " (to which the game server replies with its status), not plain "getstatus " (to which the game server replies with a disconnect packet). So I suppose the ACCEPT rule is accepting the plain string as well. I actually want it to also block the non-\xff queries. How do I modify the rule?
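For clarity, my reading of the u32 match (this is my understanding, untested): 0x1c is byte offset 28 from the start of the IP header, i.e. the first 4 bytes of the UDP payload when the IP header is 20 bytes. The inverted variant I am considering would drop non-matching packets instead of accepting them:

-A INPUT -p udp -m u32 ! --u32 "0x1c=0xffffffff" -j DROP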
I'm using Ubuntu 9.04 and the kernel 2.6 built-in NETEM tool to delay traffic.
After I apply:
tc qdisc add dev eth0 root netem delay 100ms
The upload bandwidth can't go beyond 330 KB/s, even though I have a 100 Mbit connection.
How can I fix this so that my upload bandwidth remains at full speed?
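One suspicion I have (an assumption, not verified): netem's default packet limit may be too small for a 100 ms delay at this speed, so packets get dropped from the queue. Raising it would look like:

tc qdisc change dev eth0 root netem delay 100ms limit 10000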