Search Results

Search found 9128 results on 366 pages for 'big theta'.


  • Fatal error: Out of memory (allocated ...) (tried to allocate ... bytes) not due to memory_limit setting

    - by Lorenz Meyer
    For the past few days I have been getting the following error on my server: Fatal error: Out of memory (allocated 262144) (tried to allocate 393216 bytes). Usually this error means memory consumption has exceeded the configured memory_limit, but in my case there is no relation: memory_limit is set to 128 MB, and here we don't even reach 1 MB. The server is not under heavy load either; in fact it is an intranet server with just a few people connected to it. System: Windows Server 2003, 1 GB RAM, only 600 MB used. Apache 2.2.4, PHP 5.2.3. The error appears randomly, and the amount of memory involved also varies randomly from a few kB to a few MB. Sometimes restarting Apache is required to get rid of the error, sometimes it disappears by itself; restarting Apache or the entire server only helps temporarily. Where could this problem come from? How can I narrow down the source of the error?

    Read the article

  • When can an FTP server close its passive connections?

    - by Don Kirkby
    Does the FTP protocol allow the server to close any of its passive connections while the client is still connected? Can it tell when the client is finished receiving and then close the connection? I'm including an FTP server in my application using the pyftpdlib Python project. I've got it to work in active and passive mode, but I'm a bit concerned about when it closes its passive connections. I've tried connecting to it with both FileZilla and the default ftp command in Ubuntu, and in both cases, I get a new passive port for every request. That is, if I sit in the root folder and type ls 10 times, I use up 10 ports. This means that I have to allocate a big block of passive ports for the FTP server to use so it won't run out. As soon as the client disconnects, the server releases all the passive connections associated with that client and those ports can be reused. However, a long-running connection could use up a lot of ports.
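
    Since the library is named here, a minimal sketch of at least bounding that passive port block with pyftpdlib (assuming the 1.x module layout; the directory and port numbers are arbitrary examples):

        from pyftpdlib.authorizers import DummyAuthorizer
        from pyftpdlib.handlers import FTPHandler
        from pyftpdlib.servers import FTPServer

        authorizer = DummyAuthorizer()
        authorizer.add_anonymous("/srv/ftp")   # hypothetical shared directory

        handler = FTPHandler
        handler.authorizer = authorizer
        # Restrict PASV data connections to a known block; the firewall then
        # only needs 60000-60099 open, and the server cycles through them.
        handler.passive_ports = range(60000, 60100)

        server = FTPServer(("0.0.0.0", 2121), handler)
        server.serve_forever()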

    Read the article

  • Bigger ProjectServer farm is performing worse

    - by MSPS DBA
    I am using Project Server 2007 SP3 with SharePoint 2007 SP3 and SQL Server 2008 R2. I recently moved my farm from two servers (1 DB and 1 App/Web) to a very big farm with many servers, a clustered database, a load balancer, powerful processors and a large amount of RAM. This farm has multiple web servers, Project application servers, SharePoint application servers and a separate index server. But the performance of Project Server in the new farm has gotten worse: views take even longer to load data, and project publishing time has also increased. I am also facing deadlock problems which cause Project Server queue jobs to fail. Could anyone tell me what the reason for this might be, and what the starting point should be for looking into the issue? Is it mainly because the application server now needs to communicate with other application servers, which was not necessary in the previous farm? Thanks!

    Read the article

  • Is it possible to FORMAT an external hard disk that has been encrypted using Storagecrypt?

    - by Pandian John
    Basically the big problem is that about 680 GB of data on my Seagate 2 TB external HD is lost because I was experimenting with a program called StorageCrypt. I used it a few months ago, and today I tried it again, but I didn't know that the old password was still set on the hard disk when I pressed the encrypt button. I forgot the password, which is disappointing, and the software uses 128-bit AES encryption, so there is no way I am going to recover that data. My question is: is it possible to format my hard disk even though it has been encrypted? What I mean is, is it possible to completely wipe the data so the disk is like newly bought and I can use my external hard disk again? (I tried to format it via right-click > Format, but the size of the disk is shown as 1 MB.) Answers would be very much appreciated. Thanks.
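
    One low-level way to wipe such a drive on Windows, regardless of what StorageCrypt wrote to it, is diskpart's clean command. A rough sketch; the disk number is only an example, so double-check it with list disk, since clean destroys everything on the selected disk:

        C:\> diskpart
        DISKPART> list disk
        DISKPART> select disk 2          (the 2 TB external drive in this example)
        DISKPART> clean                  (or "clean all" to zero every sector; both destroy all data)
        DISKPART> create partition primary
        DISKPART> format fs=ntfs quick
        DISKPART> assign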

    Read the article

  • Any limitations for putting an SSD in a Mini? How fast would an external HDD be via Firewire? Is Ser

    - by Cyrcle
    I'm considering getting a Mini for web programming. I do a lot of text searches, so I want to put an SSD in it. Does the Mini have any limitations that might affect the performance of an SSD? I'm trying to decide if I should get a Mini Server. I'd like to be able to have two internal drives, so one can be an SSD for the OS and the code I'm working on, and the other can be my storage drive. However, I'm not sure if I'll be using the extra functionality of the server edition of OS X or not, so I'm reluctant to pay the $200 premium. In a "regular" Mini I could put the SSD internally and use a big external drive, but would the external drive be fast enough via FireWire? Thanks in advance for any info.

    Read the article

  • Troubleshooting "connection reset" error on my linux server

    - by Chris
    I fervently hope someone here can help me with the problem I am experiencing. I am a programmer, and I have very little understanding of Linux sysadmin terminology and concepts. I am attempting to troubleshoot a problem with my website. It is a Facebook app, and whenever I try to connect using Chrome, I get an error stating that the "connection was reset". I have been Googling for four days straight trying to find a solution to this problem, but no joy. A big part of the problem is that I do not understand the terminology being used, and the output from many of the tools referenced is likewise indecipherable to me. I am running a VPS with CentOS 5, Apache, PHP, and MySQL. I could spam this post with a ton of information from my iptables, Apache, etc., but if anyone needs information from my server, please let me know how to get it, and I will post it here. Thank you for any help you can offer!
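
    A few generic first steps for gathering the kind of information people usually ask for, sketched below (the URL is a placeholder; log paths are the CentOS defaults and may differ on your VPS):

        # Reproduce the request outside the browser and watch where it dies
        curl -v http://your-app.example.com/

        # Watch Apache's error log while you reproduce the problem
        tail -f /var/log/httpd/error_log

        # Check whether Apache is listening and what the firewall currently allows
        netstat -tlnp | grep httpd
        iptables -L -n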

    Read the article

  • Enumerating network shares with NetBIOS

    - by Karrax
    Hello, I have a case where I need to find all connectable shares on my network, and preferably as much information about each share as possible. I could do this manually, but it's quite a big network and it would be too slow. If I did it manually, I'm guessing I would do something like net view, then net use //hostname, and then browse it by hand. This would, however, not give me hidden shares, so it's not a viable option. Does anyone know of a tool which can help me out in this case? I already tried Sysinternals ShareEnum, but it did not work properly: it did a half-decent job, but it gave me access denied on tons of shares that were actually open. Any tips on how I can script this are also appreciated. Thank you.
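
    If scripting it yourself is an option, a minimal sketch using the pywin32 win32net module, which also returns hidden (C$/ADMIN$-style) shares on hosts where your account has rights (the host names are placeholders):

        # Requires the pywin32 package (win32net); host names below are placeholders.
        import pywintypes
        import win32net

        hosts = ["SERVER01", "SERVER02"]

        for host in hosts:
            try:
                shares, _total, _resume = win32net.NetShareEnum(host, 1)
            except pywintypes.error as exc:
                print(f"{host}: {exc}")
                continue
            for share in shares:
                # Level-1 entries include hidden shares such as C$ and ADMIN$
                print(f"\\\\{host}\\{share['netname']}  ({share['remark']})")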

    Read the article

  • Illustrator "Save for Web & Devices" returning crappy, pixelated images

    - by Tory Waterman
    I'm trying to create a nice title for my webpage: a big white title to sit on a black background. I'm using Illustrator to do so. When I create it, it looks nice, but when I hit "Save for Web & Devices", it comes out looking like a pixelated piece of crap on the site. Is there some setting I need to change to make Illustrator save a higher-resolution image? Thanks. EDIT: I understand, from looking at some other posts, that this may be a result of "posterization" or "dither", but this is only a plain white image, so I don't see how this results in a color problem. (I could be completely misinterpreting these terms.) EDIT: Figured the version might be important... I'm using CS5.1.

    Read the article

  • Nginx save file to local disk

    - by Dean Chen
    My case is: in our China office, we have to access a web server at our USA headquarters over the Internet. The network is too slow, and we download many big image files, so all our developers have to wait. We therefore want to set up Nginx as a reverse proxy whose upstream is the USA web server. The question is: can we make Nginx save the image files from the USA web server onto its local disk? In other words, can Nginx act as a cache server?
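
    Yes, nginx's proxy cache does exactly this; a minimal sketch, where the host names, paths and sizes are placeholders to adapt:

        # In the http {} block of nginx.conf
        proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=usa_cache:50m
                         max_size=20g inactive=7d;

        server {
            listen 80;
            server_name mirror.example.cn;

            location / {
                proxy_pass        http://usa-server.example.com;
                proxy_cache       usa_cache;
                proxy_cache_valid 200 302 7d;   # keep good responses for a week
                proxy_cache_valid any     1m;
            }
        }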

    Read the article

  • How to determine the source of a wakeup from hibernation

    - by Erik
    I have a big problem with my home theater PC that runs Windows 7 64-bit. Normally, I send that PC into hibernation every evening, but from time to time it wakes up for no obvious reason and stays on until I notice, which is sometimes half a day later :( I have already checked Windows Update, which is not set to automatic, since I prefer installing updates manually. When I look in the system event log, there is an entry called "Power Troubleshooter" which tells me that my system was reactivated at a specific time, but it also says Source = Unknown, which is the most annoying part. So how can I actually figure out which process reactivates the PC? Is it possible to set a group policy that forbids applications or services from scheduling tasks that can wake the machine from hibernation at all?
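
    Two or three built-in commands often narrow this down; a sketch (run from an elevated command prompt, and note the output format varies a little between builds):

        rem What woke the machine most recently
        powercfg -lastwake

        rem Which applications or scheduled tasks hold an armed wake timer
        powercfg -waketimers

        rem Which devices are currently allowed to wake the PC
        powercfg -devicequery wake_armed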

    Read the article

  • Install Windows XP using USB

    - by AmanBe
    How do I install Windows XP from USB? I have the ISO image, and my CD-ROM drive is not working. I read up on this on the Internet, but all the articles are way too complex and long, and they all differ, so I don't know which one to try. I want to know if someone has tried something like this and can tell me the best and easiest way, like some tool that will automatically write the ISO file onto the flash drive and make it bootable or something. Thank you in advance.

    Read the article

  • How to limit my CPU power programmatically on Windows 7?

    - by Ivan
    Whenever I run a CPU-heavy activity (like compressing a big set of files into an archive, for example), my CPU switches to full throttle (maximum frequency) and shuts down from overheating in less than a minute. Instead, I would like it to stay slightly slowed down, so that it does the task a bit more slowly but is able to reach the finish. At the same time I don't want to dim my screen brightness or adjust anything else that the standard Windows power-saving system does. So how do I actually set a cap to limit my CPU power? The CPU is a Core 2 Duo T7250, the OS is Windows 7 32-bit, and there seem to be no BIOS settings or jumpers available to configure the frequencies.
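
    Windows 7 exposes a "maximum processor state" setting that caps the CPU below full throttle; a sketch using powercfg, assuming the PROCTHROTTLEMAX alias exists on your build (verify with powercfg -aliases before relying on it):

        rem Cap the CPU at 80% of its maximum performance state on AC power
        powercfg -setacvalueindex scheme_current sub_processor PROCTHROTTLEMAX 80
        powercfg -setactive scheme_current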

    Read the article

  • Read-ahead buffering while playing a video file from an optical disc

    - by Saxtus
    I was wondering if there is a program for Windows 7 (64-bit) that reads the file to be played back (usually MKV files in my case) ahead into RAM, so that the disc the video file resides on won't spin for the entire duration of the playback, but only whenever the cache is almost empty (with a cache big enough that the drive isn't needed for long periods). A program I used in the past (called DVDIdle) did that universally for every video player I wanted, but they stopped updating it six years ago and now it doesn't seem to work with Windows 7 (I tried compatibility mode too). The method I am using now is to either let the drive wear down and buzz the whole time, or manually copy the entire file to an HDD, SSD or RAM disk and play it from there. The closest thing I've found is software that slows down the drive's spin speed, but I was looking for something more convenient and automated, without waiting for an entire file to be copied before starting playback or needing any HDD space, like I was used to in the past. Thanks in advance.

    Read the article

  • Internet speed, and who controls the routers

    - by Ozgun Sunal
    I need to learn two things; each is related to the other a bit. The first one is: while our LAN speed is usually 100 Mbps or at gigabit levels (very big compared to WAN speeds), WAN speeds, for instance on DSL connections, are far lower than this. However, we are able to download huge files at those Mb speeds. Isn't this weird? (My real concern is why WAN speeds are lower than LAN speeds.) The second: who controls the routers across the wider Internet? While we, as web clients, are connected to the Internet, packets travel through those routers to the destination network(s). Are those routers all inside the ISP networks, and if not, who controls that large number of routers?

    Read the article

  • Recurring unpack failed on git repo imported from svn

    - by xavier
    I have a git repo created from svn with git-svn. Everything converted just fine, but from time to time, when I try to git push, I get: error: unpack failed: unpack-objects abnormal exit. Other repos on our server (created from scratch or imported from svn) work fine. The workaround is usually to unstage, commit and push the files one by one, modify the one that fails (e.g. add a whitespace change or something), and commit it once again. It's obviously very irritating; for big commits it's a productivity killer, and it requires a lot of pushes to the server. I'd be grateful for any suggestions on where to look; I couldn't google anything up.
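
    A few server-side checks that often narrow down intermittent unpack-objects failures, sketched below (the repository path is a placeholder; receive.unpackLimit just makes the server keep incoming packs instead of exploding them into loose objects, which sidesteps some unpack failures):

        # On the server, inside the bare repository (example path)
        cd /srv/git/project.git

        df -h .             # unpack failures are often just a full disk
        git fsck --full     # look for corrupt objects inherited from the svn import
        ls -ld objects      # the pushing user needs write access to the object store

        # Optional workaround: keep incoming packs instead of unpacking them
        git config receive.unpackLimit 1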

    Read the article

  • How to prevent people from taking software home?

    - by Robert MacLean
    Most companies I have worked at have had either a collection of disks or a network share with the installers of the commonly used software on them. This is to allow the IT department and skilled users to install the software they need on their work machines very easily. However, some users would see this as an opportunity to get "free" software for their home machines. I've seen the draconian approach of locking the machines down completely, but that does not work well (in my view; if you disagree, feel free to comment on it), because you add so much extra work for IT, and users get that Big Brother feeling. So how do you find a way to prevent users from taking software home while still allowing them to install what they need? You can assume that most of the users in the organisations I work in are smart enough to install software; I'm not worried about the tea lady here.

    Read the article

  • rsyslog server - Can you split up and organize logs?

    - by Jakobud
    I recently set up one of our servers as an rsyslog server, and I now have our firewall set up to log everything to it. But there doesn't seem to be any organization of the logs: all the firewall logs are just being dumped into /var/log/messages on the rsyslog server. I guess I was expecting them to at least end up in a machine-specific log file or directory. How can I organize the incoming logging? If I set up 20 servers to all log everything to a central rsyslog server, I really don't want everything being dumped into one big file or a few files. How can I configure rsyslog to tell it where to log what, so that, for example, all the logs for a specific server go into its own directory or file? Is this possible?
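
    Yes, rsyslog templates can key the output file on the sending host; a minimal sketch in the legacy configuration syntax (the directory is an example):

        # /etc/rsyslog.conf on the central server
        $ModLoad imudp
        $UDPServerRun 514

        # One file per sending host
        $template PerHostFile,"/var/log/remote/%HOSTNAME%/messages.log"
        if $fromhost-ip != '127.0.0.1' then ?PerHostFile
        & ~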

    Read the article

  • How can I send super large files directly to another computer over the Internet for free?

    - by Cruise
    I regularly need to transfer very large files (30 GB) to my friend - financial statistics. I don't have any problem with bandwidth: it is plentiful here. I did some research in the area, so: 1. I would not use FTP, as it is very tricky to get it working behind NAT. 2. I would not use Skype/MSN/ICQ, as they are not designed for file transfer and underperform on huge files. 3. I would not use file-sharing services, as I would need to pay for big files (30 GB is a problem here) and I don't like keeping any piece of my data on a third-party server. So, I need some smart tool that will do what I need: send files directly browser-to-browser and not browser-server-browser. Is it so complex? Is there some web application on the Internet that can do this?

    Read the article

  • Ubuntu from console/command-line/shell

    - by Xolve
    Early Linux distros, though they required a lot of manual work, were quite good to use from the command line. If the X server didn't start, or you just wanted a shell to work in, they all supported that. The network was configured by init; sound was up and ready; newly inserted devices would be configured and their configuration placed in fstab. There were also small scripts I found on many distros that used windows under X but switched to ncurses on the console. But now all of this needs a GUI with a desktop manager (KDE, GNOME); the new paradigms (NetworkManager, hal, etc.) require a GUI. So if you are only on the command line, you have to be root and edit config files or type long commands; it looks like they believe only geeky admins need that. Is there any way to make this easy in Ubuntu through the shell again?
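
    For the network part at least, the classic ifupdown configuration still works from a bare console on Ubuntu (and NetworkManager typically ignores interfaces declared there). A sketch, with the interface name and addresses as examples:

        # /etc/network/interfaces  (classic ifupdown configuration)
        auto eth0
        iface eth0 inet static
            address 192.168.1.50
            netmask 255.255.255.0
            gateway 192.168.1.1

        # then bring it up from the shell:  sudo ifup eth0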

    Read the article

  • Update Billing Email Addresses in Sage MAS90

    - by ThaKidd
    I have been tasked with finding a method of updating about 500 customer email addresses in MAS90. I recently discovered that it has the ability to email invoices to customers, and because I opened my big mouth, they now want me to find a way to add about 500 email addresses into the system so they do not have to do it manually. The office does have a SQL Server (which supports our ASP sites) that contains a list of all of the current email addresses. My plan was to use Microsoft Access 2007 with the MAS90 ODBC connector to attempt this update. My questions are: is this the right way to go, or is there a better method of obtaining these results? Does anyone happen to know which tables I should be looking at? Any and all help is appreciated. Thanks in advance!

    Read the article

  • Lustre - is this bad form?

    - by ethrbunny
    I'm going to be consolidating several 'server rooms' into a single installation soon. Part of this effort will be finding a home for 5 TB (and growing) of files and logs. To this end I'm looking at Lustre and appreciating its ability to scale. The big vendors want to sell me a $20K SAN to manage this, but I'm wondering about buying several iSCSI units (like this http://www.asacomputers.com/3U-iSCSI-Solution.html) and using VMs for the OSS machines. This would let me fail over to cover problems and would not require a dedicated system for each OSS. Given articles like this (http://h30565.www3.hp.com/t5/Feature-Articles/RAID-Is-Dead-Long-Live-RAID/ba-p/1422) that talk about how RAID is not keeping up with drive density, I'm leaning towards more disks with a lower capacity each - again, something akin to the iSCSI array above. Tell me why this is a terrible idea. Do I really need to invest in a PE710 for each OSS/OST?

    Read the article

  • 4K sectors transition: Why are hard drives moving to 4096 byte sectors, vs. 512 byte sectors?

    - by Chris W. Rea
    I've noticed that some Western Digital hard drives are now sporting 4K sectors, that is, the sectors are larger: 4096 bytes vs. the long-standing standard of 512 bytes. So: What's the big deal with 4K sectors? Is it marketing hype, or a real advantage? Why should somebody building a new PC care, or not, about 4K sectors? Why is this transition taking place now? Why didn't it happen sooner? Are there things to look out for when buying a 4K sector hard drive? e.g. incompatibility? Anything else we should know about 4K sectors?

    Read the article

  • Two identical broadband lines working as one

    - by Katafalkas
    I have been trying to find an answer to this, but all I get is hobbyists trying to connect their Linksys routers and get some magic out of them. I am thinking of a way to combine two 100 Mbps fiber-optic lines into a single connection for our office. I assume it involves some Cisco learning or something like that; I was thinking I might need to configure some big router to load-balance the NATing in some way. I assume that many of you have done something similar, so maybe someone could share the knowledge or at least provide some tips?

    Read the article

  • Fan is spinning too fast just in Windows - software?

    - by B. Roland
    I've recently replaced my fans (CPU and GPU, and I bought a chassis fan). The GPU fan has remained the same, though I have occasionally seen it spin twice as fast as usual - but that is rare. The problem is that in Windows (especially 7) the CPU fan spins too much: it keeps the CPU under 40°C, but it runs at 3300-3600 RPM, which I think is too high. If I switch to Ubuntu, it keeps the CPU at ~40-45°C with 2500-2800 RPM, which is a big difference in numbers and in noise. I'm looking for a manual fan control solution, or some way to reduce Windows' fan speed multipliers. I bought the new fans for lower noise (and they do deliver it, just not at 3.6k RPM). Thank you!

    Read the article

  • Why is ext3 so slow to delete large files?

    - by Janis Peisenieks
    I have a server which makes an incremental backup of a system every night, and on Saturdays there is a full backup. After the full backup has finished, a script kicks in that deletes the incrementals. The script sometimes breaks, because the incrementals are each about 10 GB and deleting them sometimes takes too long for the script. Could someone explain to me, or point me to a resource that explains, why ext3 is so slow to delete large files compared to, let's say, NTFS? I know these are two completely different file systems, but I'm really interested in why there is such a big difference in deletion speed.

    Read the article
