Search Results

Search found 9788 results on 392 pages for 'character limit'.


  • Home server running 2008 R2 intermittently bringing our internet down by creating a large number of connections [closed]

    - by Philip Strong
    Possible Duplicate: My server's been hacked EMERGENCY. My Server 2008 R2 home server is intermittently (every 30 minutes or so, for about 3-4 minutes) creating a huge number of connections, which reaches the 4096-connection limit and effectively DoSes our internet connection. I've run a couple of network traffic monitors, and it appears to be a system process causing the problem. I thought I'd fixed it by reinstalling Comodo Antivirus, but it appears that wasn't the problem. Any thoughts? Thanks in advance.
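
    A quick way to identify which process owns the flood of connections is with the built-in Windows tools (a diagnostic sketch; the PID filter below assumes the culprit really is the System process, PID 4):

        rem List established connections along with the owning process ID
        netstat -ano | findstr ESTABLISHED
        rem Map a suspicious PID to its process and hosted services (PID 4 = System)
        tasklist /svc /fi "PID eq 4"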

    Read the article

  • FreeNAS on Dell PowerVault 745N: 2TB Limit?

    - by willoller
    I want to purchase 2 x 2TB drives and install FreeNAS on my Dell PowerVault 745N. People on the internet seem to be having trouble with the MD3000 firmware, and I want to make sure I can solve any issues before buying the drives. Before I invest, I have three questions: 1) Is there a partition size limit determined by the RAID controller? That is, could I have a striped 4TB partition? 2) The spec sheets make me wonder if the RAID controller needs all 4 drives in order to work. Is there any reason this will have to run in RAID 5? 3) If I buy 4 matching drives, would the controller support a RAID 6 configuration? I'm basically new to all this RAID stuff - sorry for any noob questions.

    Read the article

  • Outlook Calendar Error "The attachment size exceeds the allowable limit", even when there is no attachment

    - by Laura Kane-Punyon
    I have an Outlook meeting series that occurs weekly. Every week, I open the current occurrence, attach the current week's meeting materials and send an update to the invitees. I have done this for the same meeting series for several years. Recently, I started running into a problem where every time I try to update or cancel a meeting occurrence I receive the error "the attachment size exceeds the allowable limit", regardless of whether or not I have an attachment. I am not even able to cancel the series in order to start fresh. I have two questions regarding this issue: 1) Is there any way to send attachments on an occurrence of a meeting invite without running into this issue? 2) Is there any way to remove this meeting series?

    Read the article

  • Postfix, limit destination concurrency

    - by Hamlet
    Hi, I have a website sending email using PHPMailer (CentOS, Postfix, PHP, MySQL, etc.). Emails are getting delivered to all hosts correctly except one:

        Mar 30 14:38:22 server postfix/qmgr[15467]: 7237D218852D: to=, relay=none, delay=0.04, delays=0.04/0.01/0/0, dsn=4.4.2, status=deferred (delivery temporarily suspended: lost connection with mail06.indamail.hu[91.83.45.46] while sending MAIL FROM)

    They told me to limit concurrent connections to their server to 2. I did, but they still say I connect more than twice. main.cf:

        default_destination_concurrency_limit = 1
        fallback_relay =
        smtp_destination_concurrency_limit = 1
        initial_destination_concurrency = 1

    Any ideas? Thanks, Hamlet
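
    For what it's worth, a common pattern for a hard per-destination cap is a dedicated transport whose process count enforces the limit (a sketch, with indamail.hu assumed to be the problem domain). In master.cf, define a transport capped at 2 processes:

        # master.cf: at most 2 smtp processes, hence at most 2 concurrent connections
        slow      unix  -       -       n       -       2       smtp

    Then route the domain through it in main.cf and the transport map:

        # main.cf
        transport_maps = hash:/etc/postfix/transport
        slow_destination_concurrency_limit = 2

        # /etc/postfix/transport
        indamail.hu    slow:

    followed by "postmap /etc/postfix/transport" and "postfix reload".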

    Read the article

  • Postfix Stagger/Rate Limit Outbound Mail

    - by GruffTech
    We have a server that sends our weekly newsletter to subscribers. To prevent providers like Hotmail or Yahoo from blocking us for sending too many simultaneous emails, is there a way we can stagger or rate-limit outbound email from Postfix? Keep in mind, I don't want the mail server to stop queuing mail or accepting new messages - simply defer delivery if there are more than 3-4 messages per destination domain/IP address, or something similar. Note: I don't want a sender throttle, as described in a similar question here. I'm looking more for a recipient throttle, but haven't had any luck finding out how to do so with PolicyD or Anvil services, and was wondering if anyone else has accomplished such a task.
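
    Postfix 2.5 and later can do this per transport. A sketch (the domain list and numbers are illustrative): route the big freemail domains through a throttled transport and add a rate delay, noting that a non-zero rate delay forces per-destination concurrency down to 1:

        # master.cf: dedicated delivery transport, at most 4 processes
        throttled  unix  -       -       n       -       4       smtp

        # main.cf
        transport_maps = hash:/etc/postfix/transport
        throttled_destination_rate_delay = 1s

        # /etc/postfix/transport
        hotmail.com    throttled:
        yahoo.com      throttled: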

    Read the article

  • Creating one app pool per month to limit the scope of issues

    - by user39550
    I have about 360 sites running in a single app pool. Now, I know we have a coding issue with one of those sites, where we have accidentally created a memory leak. So what happens is: the site runs, the memory leak starts, and soon the app pool runs out of memory. Then, slowly but surely, the rest of the 360 sites start going down like dominoes. I understand that the root of the problem is some bad coding, which we'll fix, but instead of risking all 360 sites, I was thinking we could create a new app pool each month, and every site we create that month would go into that month's app pool. First, that would limit the scope of any issue to 5-20 sites, and second, if one site started having issues we wouldn't be bringing down all 360 sites. Are there any problems with this thinking, or possible ramifications? Thanks in advance! Jeremiah
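
    If you go that route, the per-month pools can be scripted with appcmd (a sketch; the pool and site names are illustrative):

        rem Create this month's pool
        %windir%\system32\inetsrv\appcmd add apppool /name:"Pool-2012-06"
        rem Assign a new site's root application to it
        %windir%\system32\inetsrv\appcmd set app "ExampleSite/" /applicationPool:"Pool-2012-06"

    Bear in mind each pool is a separate worker process with its own memory overhead, so monthly grouping trades isolation against RAM.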

    Read the article

  • PHP APC frequently exceeding the apc.shm_size limit

    - by user142741
    I am implementing APC on a shared server that currently hosts 1000 sites (running WordPress, Moodle, etc.). Looking at the admin page, I see that "Cache full count" is growing rapidly. I've tried increasing the value of apc.shm_size, reducing the value of apc.ttl, and increasing the value of apc.shm_segments, but I cannot resolve this issue. What am I doing wrong? Here is some information:

        apc.ini:
            extension=apc.so
            apc.shm_size = 256
            apc.enabled = 1
            apc.ttl = 300
            apc.user_ttl = 300

        Ubuntu: 12.04
        PHP: 5.3.10
        APC: 3.1.7
        Server has 16GB memory
        Shared memory limit: 256MB
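
    One well-known pitfall worth ruling out here: from APC 3.1.4 onwards, apc.shm_size uses PHP's shorthand-byte syntax, so a bare "256" is read as 256 bytes (and APC silently falls back to a small default) rather than 256MB. A hedged first step:

        ; apc.ini - use an explicit suffix on APC >= 3.1.4
        apc.shm_size = 256M

    Then confirm the running value with "php -i | grep apc.shm_size" before tuning ttl or segments further.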

    Read the article

  • How to limit disk space per user in a PHP web application on CentOS

    - by solid
    We have a web application written in PHP and we want all our users to be able to upload images up to, e.g., 50MB in total. We will create a directory structure so that every user has their own folder, like app/user1/images, app/user2/images, ... Now, every time a user uploads an image, we need to check whether this is still allowed or not, but we don't want 1000 users continuously scanning our hard drive by counting the file sizes in their directories. So writing a script that counts all file sizes in a user's directory is not an option, I guess? Is there an easier way to calculate the used-up space per user and limit our app accordingly?
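
    One approach that avoids scanning the disk is to keep a running byte total per user in the database and update it at upload time. A minimal sketch (the user_quota table, $pdo, $userId and $targetPath are illustrative assumptions):

        <?php
        // Sketch: enforce a per-user quota from a cached DB total, not a directory scan
        $limit = 50 * 1024 * 1024;                        // 50MB quota
        $size  = filesize($_FILES['image']['tmp_name']);  // size of the pending upload

        $stmt = $pdo->prepare('SELECT bytes_used FROM user_quota WHERE user_id = ?');
        $stmt->execute(array($userId));
        $used = (int) $stmt->fetchColumn();

        if ($used + $size > $limit) {
            die('Quota exceeded');
        }

        move_uploaded_file($_FILES['image']['tmp_name'], $targetPath);
        $pdo->prepare('UPDATE user_quota SET bytes_used = bytes_used + ? WHERE user_id = ?')
            ->execute(array($size, $userId));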

    Read the article

  • SQL Server User Mapping - Limit view of databases for a user

    - by Jaime
    Hi there, I am adding a new login with SQL Server authentication. I set its server role as public and then went into User Mapping, selecting the only database this user should have access to. I then changed the default schema to dbo and made this user the db_owner. When I connect to the instance using the new user's credentials, I can see not only the database he should have access to but all the other attached databases as well. How can I limit this user to seeing only the database he has access to? Thanks in advance!
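
    The usual approach (a sketch; the names are illustrative) is to deny the metadata permission that lets every login enumerate databases, then transfer ownership so the one database stays visible:

        USE master;
        DENY VIEW ANY DATABASE TO [NewUser];
        -- A login can still see databases it owns:
        ALTER AUTHORIZATION ON DATABASE::[TheirDatabase] TO [NewUser];

    Note this hides the other databases from Object Explorer, but it is the permission mappings, not this setting, that actually block access.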

    Read the article

  • How to limit disk performance?

    - by DrakeES
    I am load-testing a web application and studying the impact of some config tweaks (related to disk I/O) on overall app performance, i.e. the number of users that can be handled simultaneously. The problem is that I hit 100% CPU before I can see any effect from the disk-related config settings. I am therefore wondering if there is a way to deliberately limit disk performance so that it becomes the bottleneck and the tweaks I'm playing with actually start impacting performance. Should I just keep the hard disk busy with something else? What would serve best for this purpose? More details (probably irrelevant, but anyway): PHP/Magento/Apache, studying the impact of apc.stat. Setting it to 0 stops APC from checking PHP scripts for modification, which should improve performance where disk is the bottleneck. Using JMeter for benchmarking.
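
    One way to do this on Linux is cgroup blkio throttling (a sketch; it assumes a 2.6.37+ kernel with the blkio controller mounted at the usual path, and that the web root lives on /dev/sda, major:minor 8:0):

        # Create a cgroup and cap reads from the disk to 1MB/s
        mkdir /sys/fs/cgroup/blkio/slowdisk
        echo "8:0 1048576" > /sys/fs/cgroup/blkio/slowdisk/blkio.throttle.read_bps_device
        # Move the Apache/PHP processes into the group
        for pid in $(pgrep apache2); do echo $pid > /sys/fs/cgroup/blkio/slowdisk/tasks; done

    Files already in the page cache never hit the disk at all, so dropping caches between runs ("echo 3 > /proc/sys/vm/drop_caches") keeps the test honest.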

    Read the article

  • Windows 2003 remote desktop configuration - "Active session limit" greyed out

    - by wes
    I have a terminal server which works fine except for one thing: users are logged off after 2 hours, regardless of activity. I have "Override user settings" checked in the appropriate control window, and "End a disconnected session: Never" is set. But I found that "Active session limit" is greyed out, so I can't change it, and it is set to 2 hours. The user (only one actually needs a session on this server for more than 2 hours at a time) is able to reconnect to his session immediately. http://the-wes.com/images/active-session-disabled.jpg Any ideas? thanks, -wes
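
    That field is typically greyed out because a Group Policy overrides the Terminal Services Configuration value. A hedged check - if the "Set time limit for active Terminal Services sessions" policy is in force, this registry value (in milliseconds) will exist:

        reg query "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" /v MaxConnectionTime

    If it does, clear the policy in gpedit.msc (or the domain GPO) rather than editing the registry directly.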

    Read the article

  • SBSMonitoring.mdf reached limit

    - by Bastien974
    I have SBS 2008 Standard. I see some errors in my Event Viewer for MSSQL$SBSMONITORING, event IDs 1105 and 1827: "Could not allocate space for object 'dbo.EventLog'.'PK_EventLog' in database 'SBSMonitoring' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup." "CREATE DATABASE or ALTER DATABASE failed because the resulting cumulative database size would exceed your licensed limit of 4096 MB per database." I tried to shrink the database; that worked for SBSMonitoring_log.LDF, but did nothing for SBSMonitoring.mdf, which is still 4096MB. I don't know how to reinstall the monitoring. Thanks for your help.
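
    A heavily hedged sketch of one way to claw back space, given that the error names dbo.EventLog (verify the table and its date column before deleting anything; the column name below is a guess):

        USE SBSMonitoring;
        EXEC sp_spaceused 'dbo.EventLog';   -- confirm this table is what filled the 4GB
        -- DELETE FROM dbo.EventLog WHERE ErrorDateTime < DATEADD(DAY, -30, GETDATE());
        DBCC SHRINKDATABASE (SBSMonitoring);

    The .mdf cannot shrink until rows are actually removed, which is why shrinking alone left it at 4096MB.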

    Read the article

  • Limit disk I/O one program creates?

    - by Posipiet
    Hardware: one virtualization server - dual Nehalem, 24GB RAM, 2TB mirrored HD. Software: Debian, KVM, virt-manager on the server, with several virtual machines that also run Linux. The 2TB disk is one big LVM volume group; each VM gets a logical volume and makes its own partitions in it. Problem: one of the programs running in one of the VMs creates a huge disk load. This was never an issue before, because the program never ran on such powerful hardware. Now the CPUs are fast, and lots of I/O is the result. We can't do much about that at the moment, because the tool is a black box. On the other hand, the speedy computation is welcome. The program creates about 5GB of temp files which get overwritten during the next iteration. Question: how can we limit the disk I/O of the process?
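
    If the host's I/O scheduler is CFQ, one low-effort option is to drop the VM's qemu process into the idle I/O class with ionice (a sketch; the process name varies between qemu-kvm, kvm and qemu-system-x86_64 depending on the distro):

        # Find the offending VM's PID, then deprioritize its disk access
        pidof qemu-kvm
        ionice -c 3 -p <PID>

    Class 3 (idle) only grants the process disk time when nothing else wants it; class 2 with a high -n value is a gentler alternative.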

    Read the article

  • Limit maximum incoming connections to a port using iptables

    - by Harley
    I have a server with Apache listening on a number of ports. Some ports are used for configuring the server, and another is used to download large files. My problem is that when I have a large number of clients downloading files, the web interface becomes unreachable. I would like to limit the number of clients connecting on the "large file" port so that Apache always has connections available for configuring the server. A REJECT is fine; the client trying to download the file will back off and retry later. Each client only has one connection open to the server at a time, so limiting by IP won't work. I know I could put something in front of Apache to manage this, but I'd really like to do it in iptables, without adding more software.
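
    iptables can do this with the connlimit match (a sketch; 8080 and the limit of 50 are placeholders for your "large file" port and threshold). The unusual part is --connlimit-mask 0, which lumps every source IP into one group so the limit applies to total connections rather than per host:

        iptables -A INPUT -p tcp --syn --dport 8080 \
            -m connlimit --connlimit-above 50 --connlimit-mask 0 \
            -j REJECT --reject-with tcp-reset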

    Read the article

  • Limit ftp users to only certain directories in Ubuntu

    - by George
    There are several questions about limiting FTP users to certain directories. However, most of them refer to vsftpd, which I don't think I have installed on my system. I'm running Ubuntu 9.04. How can I tell which FTP service I have installed, and then limit certain users to only the /home/ftpuser directory instead of giving them full access to the file system? I think I can add them to a separate group and give that group access to the proper directories, but then do I have to remove that group's permissions from all other directories? It seems like there should be an easy way, like setting the chroot_local_user value in the /etc/vsftpd/vsftpd.conf file, but that doesn't exist on my system.
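
    A hedged starting point - first see what is actually listening on the FTP port, then (if it is, or becomes, vsftpd) chroot users to their home directories:

        # Which daemon owns port 21?
        sudo netstat -tlnp | grep :21
        # If vsftpd is installed (note Ubuntu puts the config at /etc/vsftpd.conf,
        # not /etc/vsftpd/vsftpd.conf), enable in that file:
        #   chroot_local_user=YES

    With the users' home directory set to /home/ftpuser in /etc/passwd, they are then confined there without touching group permissions elsewhere.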

    Read the article

  • Squid Proxy Server: limit total bandwidth

    - by Pitto
    Hello my dear friends! We have a marvellous Squid proxy with DansGuardian for filtering, and they both work just great. Is there any easy way to limit the total bandwidth usage? I'd like to cap the total bandwidth available to Squid users at 1200, since our total band is 2000, and I need the rest to ensure other services such as VoIP work without hiccups related to huge downloads on the "internet side" of our connection and similar issues. I mean a total Squid bandwidth limit, not a per-user one. Fair thanks to everybody.
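
    Squid's delay pools can cap the aggregate. A sketch, assuming the 1200/2000 figures are kbit/s (delay_parameters takes bytes per second, and 1200 kbit/s is roughly 150000 B/s):

        # squid.conf - one class 1 (aggregate) bucket shared by all users
        delay_pools 1
        delay_class 1 1
        delay_access 1 allow all
        delay_parameters 1 150000/150000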

    Read the article

  • Windows Server 2008: Limit UDP/TCP packets per IP or ban

    - by WBAR
    How can I limit the UDP/TCP packets per IP sent to my host (or better, per port) per second or minute? It would be nice to ban the offending IP for 12/24 hours, or even forever. I have Windows Server 2008, and I'm very poor at Windows administration but quite good with Linux. EDIT: My basic problem is that they are sending a lot of rubbish UDP and TCP packets - TCP packets without SYN, fragmented UDP packets - so my servers stop responding. So I need to cut off users (IPs) sending more than X packets per second. I need a solution which provides me, somehow configurable: X packets of a certain type (UDP, TCP, or both - let's say a parameter named Z) are allowed to be received per IP on port Y; otherwise the packet should be DROPPED. My virtual hosts are hosted in VirtualBox, and I'm able to forward all incoming packets of a certain type and port to a specific virtual host, but I need to DROP them before VirtualBox receives them.

    Read the article

  • Getting around the FAT32 4GB file size limit

    - by Aesir
    I recently purchased a 32GB USB 3 stick that was formatted FAT32. I plugged it into my computer and attempted to copy a film onto it; the file was over 4GB, however, and it would not let me copy the file across because of the 4GB file size limit imposed by FAT32. After some googling I found that I could format my USB stick as exFAT, which would mean I could put files greater than 4GB onto the stick, and the drive would work on both my Mac and my PC. The problem with this solution is that my PS3 cannot detect the USB stick when it is formatted with exFAT. I would like to know if there is a way I can have my USB stick formatted so it can hold files greater than 4GB and work on my PC, Mac and PS3.

    Read the article

  • NAT Error Message - Usage limit exceeded

    - by Kato
    I'm trying to configure a port to use for Vuze. Using the NAT/server port test, I was getting a message saying the connection timed out and the port was probably closed. I went back and made sure to open the specific port on my router, firewall, etc., but now I'm getting "NAT Error - Usage limit exceeded (173.32.41.24:0)". I'm on a Mac Pro running Leopard, with trial versions of Intego NetBarrier and VirusBarrier. Network Utility and NetBarrier both claim the port is open. I've tested a bunch of other ports, but all give the same message.

    Read the article

  • Programmatically Determine Exchange Attachment Limit

    - by Jeff Ballard
    Is there any way to query the Exchange server to determine the maximum attachment file size? I'd be doing this in ASP.NET/C#. I'd like to be able to validate that the file a user wants to attach is not over the limit before they attempt to send it to the server, as opposed to having the server send back an exception when it attempts to attach the file and discovers it is too large. I've also posted this question on stackoverflow.com - I figured an Exchange sysadmin may have an answer as well as a developer. Hopefully I do not incur the wrath of the Stack Exchange gods.
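
    On Exchange 2007/2010 the organization-wide limits are at least visible from the Exchange Management Shell, which C# code could invoke through a PowerShell runspace (a sketch; limits can also be set per send/receive connector and per mailbox, so the effective value may be lower):

        # Organization-wide limits
        Get-TransportConfig | Format-List MaxSendSize, MaxReceiveSize
        # Per-mailbox overrides (the mailbox name is illustrative)
        Get-Mailbox someuser | Format-List MaxSendSize, MaxReceiveSize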

    Read the article

  • Commit charge peak higher than system limit

    - by Grubsnik
    We are seeing some very strange behaviour on our servers, and Google didn't turn up anything useful, so I'm tossing it out here. A standard server is configured with 4GB RAM, two 4GB pagefiles, and runs Windows Server 2003. The servers run 50-120 VB6/.NET applications which normally consume no more than 100MB of memory each, but will occasionally run up to 300MB. The issue of a single process consuming way too much memory is being traced down elsewhere, but the thing that baffles us is that the reported peak commit charge is vastly higher than what we have available. As the screenshot (not reproduced here) showed, we are getting reported peaks way higher than what the system is actually capable of delivering. This number has been seen as high as 29GB, which makes no sense at all for a system with a limit of 12GB. Does anyone have an idea what is going on?

    Read the article

  • mod_rewrite issue | Request exceeded the limit of 10 internal redirects

    - by Chris Anarko Meow
    OK, what I'm doing normally works, but since my rule "includes" itself it is giving me issues, and I can't find a solution after hours of trying different options. I have a .htaccess with:

        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_URI} !^/3.15.0/(.*)
        RewriteRule ^(.*)$ /3.15.0/$1 [L]

    This is for my software versions. I have a program that sometimes requests outdated versions, while the server may be a couple of versions ahead, so I want to say: whatever comes in, forward it to the latest version, which in this example is 3.15.0 (/var/www/nameblabla/3.15.0). My .htaccess is at /var/www/nameblabla/.htaccess, so the first condition is meant to ignore requests that already have the right path and version, and the second should grab all other requests and forward them to 3.15.0 - without losing the path to the requested files, of course. So far I can only get it to redirect to that directory, but it loses the path, or I get "Request exceeded the limit of 10 internal redirects". I guess this is because I'm including the 3.15.0 path. Any help, or another way to do this without mod_rewrite?
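
    In per-directory (.htaccess) context the rewritten URL is re-injected and the rules run again, which is how the redirect limit gets hit. One robust pattern (a sketch - note the escaped dots, which the original condition leaves unescaped) is an explicit pass-through rule for already-versioned requests:

        RewriteEngine On
        RewriteBase /
        # Requests already under the current version pass straight through
        RewriteRule ^3\.15\.0/ - [L]
        # Everything else is forwarded, keeping the original path in $1
        RewriteRule ^(.*)$ /3.15.0/$1 [L]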

    Read the article

  • How to limit router bandwidth?

    - by David
    Hello, is there a way to configure my (D-Link DIR-615) router to throttle the allowed bandwidth after a certain amount of data has been used? For instance, I want my router to operate normally up to 20GB; after 20GB, I want the router to limit bandwidth to a fraction of the normal speed (perhaps 1/5th). I live in Canada, so in about a month everyone is going to be billed based on the amount they use (usage-based billing). Instead of the unlimited bandwidth that I am enjoying now, most people will be capped at 25GB and will have to fork out $2/GB for overage. Thank you in advance for the help.

    Read the article

  • PowerShell - Limit the search to only one OU

    - by NirPes
    I've got this cmdlet and I'd like to limit the results to only one OU:

        Get-ADUser -Filter {(Enabled -eq $false)} | ? { ($_.distinguishedname -notlike '*Disabled Users*') }

    Now I've tried to use -SearchBase "ou=FirstOU,dc=domain,dc=com", but I get this error:

        Where-Object : A parameter cannot be found that matches parameter name 'searchbase'.
        At line:1 char:114
        + Get-ADUser -Filter {(Enabled -eq $false)} | ? { ($_.distinguishedname -notlike '*Disabled Users*') } -searchbase <<<< "ou=FirstOU,dc=domain,dc=com"
            + CategoryInfo          : InvalidArgument: (:) [Where-Object], ParameterBindingException
            + FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.PowerShell.Commands.WhereObjectCommand

    What I'm trying to do is get all the disabled users from a specific OU, BUT there is an OU INSIDE that FirstOU that I want to exclude: the "Disabled Users" OU. As you might have guessed, I want to find disabled users in a specific OU who are not in the "Disabled Users" OU inside that OU. My structure:

        Forest
            FirstOU
                Users, groups, etc...
                Disabled Users OU
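
    The error is just parameter binding: -SearchBase belongs to Get-ADUser, so it has to appear before the pipe; tacked onto the end of the pipeline it binds against Where-Object, which has no such parameter. A sketch of the working shape:

        Get-ADUser -Filter { Enabled -eq $false } -SearchBase "OU=FirstOU,DC=domain,DC=com" |
            Where-Object { $_.DistinguishedName -notlike '*OU=Disabled Users*' }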

    Read the article

  • Limit number of concurrent user logins in Windows Server 2008 Active Directory

    - by smhnaji
    Is there a way to limit Active Directory users' maximum concurrent login sessions? I've read many articles and discussions about this, but none of the solutions seem to work. Many suggested a UserLogin script that doesn't work in Windows Server 2008. Some others suggested CConnect, which is not good enough and is also very complicated. Others have pointed to UserLock, which has to be paid for. It's surprising that Windows Server 2003 DID have this feature (albeit via a third-party tool), but Windows Server 2008 doesn't! One of the articles I've read: http://www.edugeek.net/forums/windows-server-2008-r2/61216-multiple-logins.html

    Read the article
