Search Results

Search found 5683 results on 228 pages for 'zend pdf'.


  • Can I password protect a Publisher file?

    - by tombull89
    I was asked earlier this week if it was possible to password protect a Microsoft Office 2007 Publisher document. I was under the impression that it would be like protecting a Word document, by going to Office > Save As > Word Document > Tools > General Options and creating a password to modify, like shown below. This also works for Excel documents. However, in Publisher 2007 the option is not there. The only option under "Tools" is "Map network drive". We worked around the issue by saving as a PDF and distributing that, but is there a way to do what we want?

    Read the article

  • Allow users to view Word documents only and not be able to edit, copy or save them.

    - by Alexander
    Hello. In a traditional Windows Server 2003 environment with AD, we have shared a folder for our policy documents (MS Word). These documents get edited/updated now and then by the administrator (the principal of the college). Users only have read-only access to the folder, but they can still Save As and then change the content. SharePoint is a possible solution but not easy to implement. We also thought of using a CMS on Linux and installing Joomla to let users only view the docs with a document management system... but is it possible to automatically retrieve the policy folder on the network and convert or put it in a format that users can only view and not copy? We also thought of saving the docs to secure PDF format, but the principal wants an automated system. Basically she just wants to work in Word, and the policies must be available to staff members on the network. Any ideas? Much appreciated.

    Read the article

  • CS3, Illustrator - Where Do X/Y Coordinates Measure from?

    - by nicorellius
    I have a project where there are several PDF files. I'm using Illustrator to make these. It seems the point/line of origin is inconsistent from image to image (file to file). Where is the point of origin, by default, in CS3 Illustrator? It would be nice if, while I was positioning images, I could just say, "OK, x coordinate is 5.5 inches in this document, so it is 5.5 in that one." But it seems this is not the case. Anyone know how Illustrator sets these parameters?

    Read the article

  • Extract attachments from mbox through MIME

    - by Simeon
    I am a little bit frustrated. I'm working on a project with the aim of building a system which automatically prints the e-mail attachments of incoming mail (an "e-mail to print" system). I have already set up an e-mail server (exim4) which receives e-mail perfectly and stores it in an mbox in /var/mail/ - now I want to extract the attachments out of the mbox file through MIME to the original .PDF, .DOC, .JPG, .GIF, ... and save them in a directory, from where they get printed. After the e-mail attachments have been extracted they should be deleted, so they don't get extracted again. But how can I get this to work? I am not a coder, so I looked for existing scripts and programs but found nothing to work with. Could anyone give me a little help - I would be very thankful! Thanks, Simeon
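
    For reference, here is a minimal sketch of one way to do this in PHP, assuming the PEAR Mail_mimeDecode package is installed (pear install Mail_mimeDecode) and that the mbox uses the usual "From " separator lines. The paths, the mbox name and the truncation step at the end are examples, not a finished system:

    <?php
    require_once 'Mail/mimeDecode.php';

    $mboxPath = '/var/mail/print';      // example mbox
    $outDir   = '/var/spool/printjobs'; // example output directory

    // naive mbox split: each message starts with a line beginning "From "
    $raw      = file_get_contents($mboxPath);
    $messages = preg_split('/^From /m', $raw, -1, PREG_SPLIT_NO_EMPTY);

    foreach ($messages as $message) {
        $decoder   = new Mail_mimeDecode('From ' . $message);
        $structure = $decoder->decode(array(
            'include_bodies' => true,
            'decode_bodies'  => true,
            'decode_headers' => true,
        ));
        save_attachments($structure, $outDir);
    }

    // walk the MIME tree and write out every part that carries a filename
    function save_attachments($part, $outDir)
    {
        $name = null;
        if (isset($part->d_parameters['filename'])) {
            $name = $part->d_parameters['filename'];
        } elseif (isset($part->ctype_parameters['name'])) {
            $name = $part->ctype_parameters['name'];
        }
        if ($name !== null && isset($part->body)) {
            file_put_contents($outDir . '/' . basename($name), $part->body);
        }
        if (!empty($part->parts)) {
            foreach ($part->parts as $child) {
                save_attachments($child, $outDir);
            }
        }
    }

    // so nothing is extracted twice, the mbox could be emptied afterwards:
    // file_put_contents($mboxPath, '');
    ?>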

    Read the article

  • How do I insert a newline within a long formula using OpenOffice.org Math?

    - by Lekensteyn
    I've the following formula (the duplicated x^3 term in the original post was a typo; it appears once in the expansion):

    (x^6+x^4+x^2+x+1)(x^7+x+1)
        = x^13+x^11+x^9+x^8+x^7 +
          x^7+x^5+x^3+x^2+x +
          x^6+x^4+x^2+x+1
        = x^13+x^11+x^9+x^8+2x^7+x^6+x^5+x^4+x^3+2x^2+2x+1

    Putting this into OpenOffice.org Math causes every line to be concatenated, which I want to avoid. I've already tried putting newline between the lines, but it adds a strange question mark to the formula. Using matrices did not work for me either. I want to achieve a nicely formatted formula like this one (taken from the FIPS 197 PDF):

    Read the article

  • How to configure remote access to multiple subnets behind a SonicWALL NSA 2400

    - by Kyle Noland
    I have a client that uses a SonicWALL NSA 2400 as their firewall. I need to set up a second LAN subnet for a handful of PCs. Management has decided that there should be a second subnet even though we intend to allow access across the two subnets - I know... I'm having trouble getting communication across the 2 subnets. I can ping each gateway, but I cannot ping or seem to route traffic from subnet A to subnet B. Here is my current setup:

    X0 Interface: LAN zone with IP address 192.168.1.1
    X1 Interface: WAN zone with WAN IP address
    X2 Interface: LAN zone with IP address 192.168.75.1

    I have configured ARP and routes for the secondary subnet (X2) according to this SonicWALL KB article: http://www.sonicwall.com/downloads/supporting_multiple_firewalled_subnets_on_sonicos_enhanced.pdf using "Example 1". At this point I don't mind if I have to throw the SonicWALL GVC software VPN client into the mix to make it work. It feels like I have an Access Rule issue, but for testing I made the LAN > LAN, WAN > LAN and VPN > LAN rules wide open, with the same results.

    Read the article

  • How to force-restart a PC with vPro technology?

    - by Dan Nissenbaum
    I would like to know how to force-restart a PC that has crashed/hung and become completely non-responsive, using 2nd-generation vPro technology. Assume there is a second, fully responsive PC on the same LAN that can be accessed remotely to assist. Specifically, I am considering purchasing a PC with an i7-2860QM CPU, which is vPro-enabled (according to Intel). Here are two links that indicate it should be possible to force-restart a hung system with a 2nd-generation vPro-enabled CPU:

    - Seconds 24-39 of "What Is Intel vPro™ Technology?"
    - Page 17 (21 of the PDF) of "Intel® vPro™ Technology: Reference Guide"

    However, after extensive research, I cannot find a straightforward and trustworthy source of confirmation that this will actually work as I describe, or any documentation about how to set it up. I would appreciate both a reliable confirmation and a source of documentation. This question is a follow-up to: Wake-on-LAN (WOL) fails after computer crashes (Windows 7 64-bit).

    Read the article

  • Does a Samsung G3 Station external hard drive stop working when the power supply voltage is too high?

    - by Cacovsky
    I have a Samsung G3 Station 2TB external hard drive (link to PDF specs here). It was working perfectly until I accidentally plugged it into my notebook's power supply. The notebook's power supply is 19V/3.42A; the hard drive's is 12V/2A, and I know that, inside its case, there is a regular 2TB SATA drive along with some sort of adapter board. Does this adapter have some kind of power protection? I opened the case and the hard drive board smells bad. Is my data lost forever, or can I replace the board?

    Read the article

  • Text comparison utility

    - by Aaron
    I know this has been asked before... but I have a spin on it, as I have been trying out various free software offerings. I want to rid our department of DiffDoc; the problem is that I am having trouble locating something that will do what we need. WinMerge has been the latest attempt... The problem is simple. One Word doc... one PDF with a portion of it containing the text to be compared against. Compare them and be done. Raw text, ignore whitespace, ignore carriage returns, etc... Just compare the text and give me the results in some sort of report. NOTE: Have tried ExamDiff, kdiff3, Tortoise, and a few others...

    Read the article

  • Any opinions on Paragon's NTFS for Mac solution?

    - by AngryHacker
    I am currently using the free NTFS-3G to access my NTFS drive from the Mac. It seems pretty stable (except once, in the very beginning, when it locked up the Mac and corrupted my NTFS drive, which I then fixed with chkdsk from a PC). However, speed is NOT one of its virtues. I've been looking at buying Paragon NTFS for Mac OS X 7.0. Their product comparison PDF claims nearly double the speed (vs NTFS-3G) in almost every category (read, write, etc...). Can someone tell me whether the product is stable and whether the speed claims are true?

    Read the article

  • Warning: Cannot modify header information - headers already sent

    - by Pankaj Khurana
    Hi, I am using the code available at http://www.forosdelweb.com/f18/zip-lib-php-archivo-zip-vacio-431133/ for creating a zip file. The first file is zip.lib.php:

    <?php
    /* $Id: zip.lib.php,v 1.1 2004/02/14 15:21:18 anoncvs_tusedb Exp $ */
    // vim: expandtab sw=4 ts=4 sts=4:

    /**
     * Zip file creation class.
     * Makes zip files.
     *
     * Last Modification and Extension By :
     *     Hasin Hayder
     *     HomePage : www.hasinme.info
     *     Email : [email protected]
     *     IDE : PHP Designer 2005
     *
     * Originally Based on :
     *     http://www.zend.com/codex.php?id=535&single=1
     *     By Eric Mueller <[email protected]>
     *
     *     http://www.zend.com/codex.php?id=470&single=1
     *     by Denis125 <[email protected]>
     *
     *     a patch from Peter Listiak <[email protected]> for last modified
     *     date and time of the compressed file
     *
     * Official ZIP file format: http://www.pkware.com/appnote.txt
     *
     * @access public
     */
    class zipfile
    {
        /**
         * Array to store compressed data
         *
         * @var array $datasec
         */
        var $datasec = array();

        /**
         * Central directory
         *
         * @var array $ctrl_dir
         */
        var $ctrl_dir = array();

        /**
         * End of central directory record
         *
         * @var string $eof_ctrl_dir
         */
        var $eof_ctrl_dir = "\x50\x4b\x05\x06\x00\x00\x00\x00";

        /**
         * Last offset position
         *
         * @var integer $old_offset
         */
        var $old_offset = 0;

        /**
         * Converts a Unix timestamp to a four byte DOS date and time format
         * (date in high two bytes, time in low two bytes allowing magnitude
         * comparison).
         *
         * @param integer the current Unix timestamp
         * @return integer the current date in a four byte DOS format
         * @access private
         */
        function unix2DosTime($unixtime = 0)
        {
            $timearray = ($unixtime == 0) ? getdate() : getdate($unixtime);

            if ($timearray['year'] < 1980) {
                $timearray['year']    = 1980;
                $timearray['mon']     = 1;
                $timearray['mday']    = 1;
                $timearray['hours']   = 0;
                $timearray['minutes'] = 0;
                $timearray['seconds'] = 0;
            } // end if

            return (($timearray['year'] - 1980) << 25)
                 | ($timearray['mon'] << 21)
                 | ($timearray['mday'] << 16)
                 | ($timearray['hours'] << 11)
                 | ($timearray['minutes'] << 5)
                 | ($timearray['seconds'] >> 1);
        } // end of the 'unix2DosTime()' method

        /**
         * Adds "file" to archive
         *
         * @param string file contents
         * @param string name of the file in the archive (may contain the path)
         * @param integer the current timestamp
         * @access public
         */
        function addFile($data, $name, $time = 0)
        {
            $name     = str_replace('\\', '/', $name);

            $dtime    = dechex($this->unix2DosTime($time));
            $hexdtime = '\x' . $dtime[6] . $dtime[7]
                      . '\x' . $dtime[4] . $dtime[5]
                      . '\x' . $dtime[2] . $dtime[3]
                      . '\x' . $dtime[0] . $dtime[1];
            eval('$hexdtime = "' . $hexdtime . '";');

            $fr  = "\x50\x4b\x03\x04";
            $fr .= "\x14\x00";  // ver needed to extract
            $fr .= "\x00\x00";  // gen purpose bit flag
            $fr .= "\x08\x00";  // compression method
            $fr .= $hexdtime;   // last mod time and date

            // "local file header" segment
            $unc_len = strlen($data);
            $crc     = crc32($data);
            $zdata   = gzcompress($data);
            $zdata   = substr(substr($zdata, 0, strlen($zdata) - 4), 2); // fix crc bug
            $c_len   = strlen($zdata);
            $fr     .= pack('V', $crc);          // crc32
            $fr     .= pack('V', $c_len);        // compressed filesize
            $fr     .= pack('V', $unc_len);      // uncompressed filesize
            $fr     .= pack('v', strlen($name)); // length of filename
            $fr     .= pack('v', 0);             // extra field length
            $fr     .= $name;

            // "file data" segment
            $fr .= $zdata;

            // "data descriptor" segment (optional but necessary if archive is
            // not served as file)
            $fr .= pack('V', $crc);     // crc32
            $fr .= pack('V', $c_len);   // compressed filesize
            $fr .= pack('V', $unc_len); // uncompressed filesize

            // add this entry to array
            $this->datasec[] = $fr;

            // now add to central directory record
            $cdrec  = "\x50\x4b\x01\x02";
            $cdrec .= "\x00\x00";                   // version made by
            $cdrec .= "\x14\x00";                   // version needed to extract
            $cdrec .= "\x00\x00";                   // gen purpose bit flag
            $cdrec .= "\x08\x00";                   // compression method
            $cdrec .= $hexdtime;                    // last mod time & date
            $cdrec .= pack('V', $crc);              // crc32
            $cdrec .= pack('V', $c_len);            // compressed filesize
            $cdrec .= pack('V', $unc_len);          // uncompressed filesize
            $cdrec .= pack('v', strlen($name));     // length of filename
            $cdrec .= pack('v', 0);                 // extra field length
            $cdrec .= pack('v', 0);                 // file comment length
            $cdrec .= pack('v', 0);                 // disk number start
            $cdrec .= pack('v', 0);                 // internal file attributes
            $cdrec .= pack('V', 32);                // external file attributes - 'archive' bit set
            $cdrec .= pack('V', $this->old_offset); // relative offset of local header
            $this->old_offset += strlen($fr);
            $cdrec .= $name;

            // optional extra field, file comment goes here
            // save to central directory
            $this->ctrl_dir[] = $cdrec;
        } // end of the 'addFile()' method

        /**
         * Dumps out file
         *
         * @return string the zipped file
         * @access public
         */
        function file()
        {
            $data    = implode('', $this->datasec);
            $ctrldir = implode('', $this->ctrl_dir);

            return $data . $ctrldir . $this->eof_ctrl_dir
                 . pack('v', sizeof($this->ctrl_dir)) // total # of entries "on this disk"
                 . pack('v', sizeof($this->ctrl_dir)) // total # of entries overall
                 . pack('V', strlen($ctrldir))        // size of central dir
                 . pack('V', strlen($data))           // offset to start of central dir
                 . "\x00\x00";                        // .zip file comment length
        } // end of the 'file()' method

        /**
         * A wrapper of the original addFile function
         *
         * Created By Hasin Hayder at 29th Jan, 1:29 AM
         *
         * @param array An array of files with relative/absolute paths to be added to the zip file
         * @access public
         */
        function addFiles($files /* only pass an array */)
        {
            foreach ($files as $file) {
                if (is_file($file)) { // directory check
                    $data = implode("", file($file));
                    $this->addFile($data, $file);
                }
            }
        }

        /**
         * A wrapper of the original file function
         *
         * Created By Hasin Hayder at 29th Jan, 1:29 AM
         *
         * @param string Output file name
         * @access public
         */
        function output($file)
        {
            $fp = fopen($file, "w");
            fwrite($fp, $this->file());
            fclose($fp);
        }
    } // end of the 'zipfile' class
    ?>

    My second file is newzip.php:

    <?
    include("zip.lib.php");

    $ziper = new zipfile();
    $ziper->addFiles(array("index.htm")); // array of files

    // the next three lines force an immediate download of the zip file:
    header("Content-type: application/octet-stream");
    header("Content-disposition: attachment; filename=test.zip");
    echo $ziper->file();
    ?>

    I am getting this warning while executing newzip.php:

    Warning: Cannot modify header information - headers already sent by (output started at E:\xampp\htdocs\demo\zip.lib.php:233) in E:\xampp\htdocs\demo\newzip.php on line 6

    I am unable to figure out the reason for this. Please help me with it. Thanks

    Read the article

  • Enable file download via redirect in IE7

    - by Christian W
    Our application enables our customers to download files to their computers. The way I have implemented it is using ASP.NET with a dropdown. When the user clicks the dropdown they get the choice of "PDF", "PowerPoint", and a couple of other choices depending on circumstances. Then, in postback, depending on the choice the user made, it will return a file (changing the content headers and such, and then bit-banging the file to the user). This works perfectly in all browsers, but IE7 complains that this is a security risk and blocks the download. Is there any way for the users to authorize downloads from our web application?

    Read the article

  • php rsync with exec() not working

    - by mojeime
    Why does this:

    rsync -avz -e ssh /home/userneme/folder [email protected]:/var/www/folder

    work from a cronjob, while this:

    exec("rsync -avz -e ssh /home/userneme/folder [email protected]:/var/www/folder");

    doesn't work? I know exec is working because I have a few places in my app that do conversion from PDF to JPG with ImageMagick (exec). SOLVED: exec was working OK; it was a permission issue on the remote server. The "local" server is a shared reseller account and the remote server is my first VPS, an Ubuntu 10.10 LAMP box. If only I had a system administrator, since I'm just a software developer forced to do this and I stink at it :) Thank you all!
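
    For anyone hitting the same wall before finding a fix like this, a minimal PHP sketch for debugging it: capture rsync's output and exit code instead of discarding them (the host and paths here are placeholders):

    <?php
    $cmd = 'rsync -avz -e ssh /home/userneme/folder user@remotehost:/var/www/folder';
    exec($cmd . ' 2>&1', $output, $exitCode); // 2>&1 folds stderr into $output

    if ($exitCode !== 0) {
        // non-zero means rsync/ssh failed; e.g. 255 is an ssh-level error
        error_log("rsync failed ($exitCode): " . implode("\n", $output));
    }
    ?>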

    Read the article

  • What is the least irritating printer manufacturer?

    - by aireq
    Currently I have a Lexmark all-in-one printer/scanner which has some of the worst drivers I've seen for a printer. The installation takes forever. Then, once it's installed, the printer will only work if I keep the "Lexmark Productivity Studio" running in my system tray. Then, later, after I've scanned something, 99% of the time the "Save to PDF" button doesn't do anything when I click it. It is also a wireless printer, but of course the only way to set any of the wireless settings is during the driver setup. So if my WEP key changes, I have to reinstall the entire printer driver. Lately I tried refilling one of the ink cartridges with a kit I bought off Amazon, and now both the printer and the drivers keep complaining about being out of "Official Lexmark Ink". This comic from The Oatmeal pretty much sums up my feelings about consumer printers and their drivers. This question is, of course, pretty subjective, but I'd like to know what (if any) consumer printer brands actually provide quality drivers and software with their printers.

    Read the article

  • I can access \\server via Explorer but a program won't

    - by Michael Savage
    From ServerA I can access \\ServerB\Telephony\Files\abcdefgh.pdf using Windows Explorer. From the same ServerA, when I try to access the same file on ServerB using a program (a program that imports files from a CSV file), I get a "File Not Found" error. On \\ServerB\Telephony\ the share is on, and I added the service account that I used to log in to ServerA. I am clueless. Please suggest something. (Oh, it's a Windows 2008 R2 server.) (By the way, I did try the IP address and FQDN; both work with Explorer, but the CSV importer won't read the path. At one time I did get Access Denied, but I don't get Access Denied anymore after adding the service account to the share. Firewalls are off on the servers.) Update: When I go to My Computer > Network, I see many servers, but ServerB is not in the list.

    Read the article

  • How to remove line breaks (or carriage returns) only from certain parts of a block of text?

    - by Luke Allen
    Whenever I copy formatted text from a PDF file which is formatted to have line breaks (or carriage returns), I need to find a way to remove these line breaks without removing the paragraph format. To do this I need to use RegEx (regular expressions) to remove only the line breaks which aren't preceded by a period. So, for example, if a string of text has a line break right after a period, that is obviously almost always a legitimate line break which will start a new paragraph. If a string of text has a line break mid-word or after a word with no period, it's simply part of the bad formatting I need to get rid of. My problem is that I don't know how to use RegEx to make it remove only the ^p marks in Word, or CRLF, or line breaks in any format, under the condition that it omits the ones following a period.
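
    As a sketch of that rule in PHP regex terms (the sample string is made up; the same PCRE pattern works in most regex-capable editors):

    <?php
    $text = "This line was wrapped\nmid-paragraph by the PDF.\nThis one ends a paragraph.";

    // (?<![.\r])  negative lookbehind: no period (and no stray CR) right before the break
    // \r?\n       matches both CRLF and bare LF endings
    $unwrapped = preg_replace('/(?<![.\r])\r?\n/', ' ', $text);

    echo $unwrapped;
    // -> "This line was wrapped mid-paragraph by the PDF.\nThis one ends a paragraph."
    ?>

    Word's own Find & Replace has no lookbehind, so one common workaround there is to replace .^p with a placeholder first, strip the remaining ^p marks, and then turn the placeholder back into .^p.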

    Read the article

  • Internet access using Edimax BR-6204WG as NIC

    - by Mat Richardson
    My internet access at home is provided by Virgin Media via their Superhub. I have a laptop with no NIC - however, I do have a spare wireless router, the Edimax BR-6204WG, which I have been led to believe can be used to bridge wireless connections. The only problem is, I'm not sure how to go about doing this. The manual for the device is here: http://www.edimax.co.uk/images/Image/manual/Wireless/BR-6204Wg/BR-6204Wg_Manual.pdf Basically, I want to be able to connect the Edimax wireless router to my laptop using an ethernet cable and use it to pick up the wireless connection from my Virgin Superhub. I've managed to get so far in some ways, but then I'm stuck.

    Read the article

  • Apache/PHP serving file multiple times

    - by easement
    I have a system with a download.php page. The page takes an id, loads a file based on the DB record, and then serves it up. I've noticed a couple of instances where files are requested multiple times in short time spans (20 ms) - times that are too quick for human input. There are plenty of instances where the downloader functions fine. However, in taking a closer look at the downloader's usage, I did see some interesting behavior.

    For instance, the IP address xxx.xxx.xxx.xxx (which is one in a range owned by xxxxxx.de in Germany) came to the site through Google. They browsed around and then came to the page http://site.com/xxxx/press+125.php. There they issued a request for /download.php?id=/ZZ/n+aH55Y= (a PDF) at 9:04:23 AM. That alone is not a big deal. However, what is interesting is that the server seems to have been quite preoccupied with serving that request. In the logs the request first completes between 9:09:48 and 9:10:00. It looks like the user must have gotten tired of waiting during that time and requested the document two more times. Between 09:14:47 and 09:15:00 the same request appears again, except it is from 9:04:43 AM, 20 seconds later than the first request. Then it pops up a third time, with a request that started at 09:05:06 completing between 09:19:55 and 09:19:58!

    I'm suspicious of that document. In looking through the logs I see other instances where it takes the server a little while to handle that specific file. Check out this list of requests from zzz.zzz.zzz.zzz [different than above] for the file /download.php?id=/ZZ/n+aH55Y= (the same document as before):

    Request time   Complete time
    04:32:43       04:33:36
    04:32:50       04:33:36
    04:32:51       04:33:38
    04:33:05       04:33:38
    04:33:34       04:33:42
    04:33:05       04:33:42

    So something is definitely going on. Whether it has to do with this specific document tripping up the server, the download.php page's code, or whether we're just seeing the evidence of some server-level overload as it plays out in real time, I'm not yet sure. In fairness, there are other instances of people downloading /download.php?id=/ZZ/n+aH55Y= (the same PDF) without error. However, it is interesting that the multiple processes only seem to happen with this one file, and then only when it is accessed through the page http://site.com/press+125.php. It bears further investigation whether there's something amiss inside the code that causes the system to fire off multiple download requests that occupy the server. I don't know if press+125.php is a rabbit hole, but it is a weird coincidence. Any ideas? I'm totally out of ideas. Apache maxed out? Things like that.

    // DOWNLOAD.php
    $file = new files();
    $file->comparison_filter("id", "=", $id); // sql to load
    if ($file->load()) {
        $file->serve();
    }

    // FILES
    function serve()
    {
        if ($this->is_loaded) {
            if (file_exists($this->get_value("filename"))) {
                if ($this->get_value("content_type") != "") {
                    header("Content-Type: " . $this->get_value("content_type"));
                }
                header("Content-Length: " . filesize($this->get_value("filename")));
                if ($this->get_value("flag_image") == 0 || $this->get_value("flag_image") == false) {
                    header("Cache-Control: private");
                    header("Content-Disposition: attachment; filename=" . urlencode($this->get_value("original_filename")));
                }
                set_time_limit(0);
                @readfile($this->get_value("filename"));
                exit;
            }
        }
    }
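
    One pattern worth trying when a single download pins a worker for minutes is to stream the file in chunks instead of using @readfile(), so the server notices a vanished client and stops. A hedged sketch, not a diagnosis (the function name is made up, not part of the files class above):

    <?php
    function stream_file($path)
    {
        $fp = fopen($path, 'rb');
        if ($fp === false) {
            return;
        }
        while (!feof($fp)) {
            echo fread($fp, 8192);   // 8 KB chunks
            flush();                 // push output to Apache immediately
            if (connection_aborted()) {
                break;               // client went away: stop doing work
            }
        }
        fclose($fp);
    }
    ?>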

    Read the article

  • A download manager for Linux which saves downloaded files in directories by date like 2012_06_29

    - by Gart
    I've been using Download Master on Windows for years, and what I liked most about it is that this program can automatically put downloaded files into directories by download date:

    /Downloads
    |
    |--/2012_06_28
    |  |
    |  |--a.zip
    |  |--b.pdf
    |  ...
    |
    |--/2012_06_29
    |  |
    |  |--c.txt
    |  ...
    ...

    I'm looking for something similar for Linux. Is there any free download manager that can do this? I have tried KGet and uGet, but they both seem to lack this feature. If there is a way to configure them to do that, I'll be happy to know about it. Thank you.
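
    If no manager supports it directly, the filing scheme itself is only a few lines in any scripting language; a sketch in PHP (paths are examples, and hooking it to a download-complete event is left to whatever the manager offers):

    <?php
    // move a finished download into a YYYY_MM_DD directory, creating it on first use
    function file_by_date($downloadedFile, $baseDir = '/home/user/Downloads')
    {
        $dateDir = $baseDir . '/' . date('Y_m_d'); // e.g. .../Downloads/2012_06_29
        if (!is_dir($dateDir)) {
            mkdir($dateDir, 0755, true);
        }
        rename($downloadedFile, $dateDir . '/' . basename($downloadedFile));
    }
    ?>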

    Read the article

  • "Brute force attempt" on sending multiple emails

    - by bretddog
    While testing sending multiple emails, I successfully sent about 100 emails (with a 20 KB PDF attachment) to the same email address (my own), and they were all received. But on the next attempt, my cPanel account was blocked due to a "brute force attempt". Are there any special precautions I need to take when sending bulk emails? I simply looped through the code below, without pausing, for each email. What type of alert could that give on the email server, and how should I avoid it?

    client = New SmtpClient(smtp, Convert.ToInt32(port))
    AddHandler client.SendCompleted, AddressOf OnAsyncSendComplete
    client.Credentials = New System.Net.NetworkCredential(usn, psw)
    client.SendAsync(mail, token)

    Should I wait for the SendComplete event for each email before sending the next?

    Read the article

  • IE 8 doesn't appear to clear cache on demand. Is anyone else seeing this?

    - by Steve
    I have a client who uploads updated PDF files to her Concrete5 CMS through the file manager, replacing the old file with the same name. She then does a CMS "clear cache" and exits, as she should. Then, in testing, she finds that the old file still comes up when clicking on the link. On further review, the CMS file manager version tracking shows that the file has been updated, and, for me, the new file comes up as it should when clicking the link. My client has also refreshed her browser cache, and still she only gets the old file when clicking on the link. She says that, while she can't seem to force an immediate cache update, overnight it appears to update. My client is also part of a large company-wide LAN and intranet. Is it possible that there is a cache function placed outside of her local browser and CMS cache that is not updating?

    Read the article

  • mod_rewrite works fine apart from missing directory index files

    - by j w
    I have a legacy web site hosted on Apache. It has a number of web pages sitting in the public web root and its subfolders:

    publicDocs/
        directorywith_no_defaultfile/
            some-legacy-flat-page.htm
        .htaccess
        index.php
        some-legacy-flat-page.htm

    I would like to start using Zend MVC for some of the newer pages. I have got a .htaccess mod_rewrite rule working so that any request for a non-existent file is sent to be handled by the MVC bootstrap file (/index.php). With my current set-up, the following types of requests are routed to /index.php, the MVC bootstrap:

    /index.php
    /blah
    /directorywith_no_defaultfile/bloo

    The following types of requests are served by old legacy (flat) pages:

    /some-legacy-flat-page.htm
    /directorywith_no_defaultfile/some-legacy-flat-page.htm

    But when I request a non-existent file that is a directory, like /directorywith_no_defaultfile or /directorywith_no_defaultfile/, I get an error:

    Forbidden
    You don't have permission to access /directorywith_no_defaultfile/ on this server.
    Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.

    I suspect this may have something to do with the way Apache handles default files. Do you know which Apache directives could be causing this?
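
    For comparison, a typical rewrite block for this kind of setup looks like the sketch below (an illustration, not the poster's actual rules). If such a block excludes existing directories with a "RewriteCond %{REQUEST_FILENAME} !-d" test, a directory that exists but merely lacks an index file is handed back to Apache, where mod_dir and mod_autoindex then produce exactly this Forbidden/ErrorDocument response; a sketch without the -d test routes those URLs to the bootstrap instead:

    RewriteEngine On
    # forward everything to the MVC bootstrap unless it maps to a real file;
    # deliberately no "-d" condition here, so existing directories without
    # an index file still reach index.php
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^.*$ index.php [L]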

    Read the article

  • How much HDD space would I need to cache the web while respecting robots.txt?

    - by Koning Baard XIV
    I want to experiment with creating a web crawler. I'll start with indexing a few medium-sized websites like Stack Overflow or Smashing Magazine. If it works, I'd like to start crawling the entire web. I'll respect robots.txt. I'll save all HTML, PDF, Word, Excel, PowerPoint, Keynote, etc. documents (not exes, dmgs, etc., just documents) in a MySQL DB. Next to that, I'll have a second table containing all results and descriptions, and a table with words and on what page to find those words (aka an index). How much HDD space do you think I need to save all the pages? Is it as low as 1 TB, or is it about 10 TB, 20? Maybe 30? 1000? Thanks
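
    A rough way to frame the estimate (the 100 KB figure is only an assumed average size per stored document): 1 TB / 100 KB ≈ 10 million documents, so 10 TB ≈ 100 million and 1,000 TB (1 PB) ≈ 10 billion. On that scale, a handful of medium-sized sites fits comfortably in gigabytes, while a crawl of the whole indexable web sits at the petabyte end of the listed figures.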

    Read the article

  • Error headers: ap_headers_output_filter() after putting cache header in htaccess file

    - by Brad
    Receiving error:

    [debug] mod_headers.c(663): headers: ap_headers_output_filter()

    after I included this within the htaccess file:

    # 6 DAYS
    <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
        Header set Cache-Control "max-age=518400, public"
    </FilesMatch>

    # 2 DAYS
    <FilesMatch "\.(xml|txt)$">
        Header set Cache-Control "max-age=172800, public, must-revalidate"
    </FilesMatch>

    # 2 HOURS
    <FilesMatch "\.(html|htm)$">
        Header set Cache-Control "max-age=7200, must-revalidate"
    </FilesMatch>

    Any help is appreciated as to what I could do to fix this.

    Read the article

  • How to find out what is causing a slow down of the application on this server?

    - by Jan P.
    This is not the typical Server Fault question, but I'm out of ideas and don't know where else to go. If there are better places to ask this, just point me there in the comments. Thanks.

    Situation: We have this web application that uses Zend Framework, so it runs in PHP on an Apache web server. We use MySQL for data storage and memcached for object caching. The application has a very unique usage and load pattern. It is a mobile web application where, every full hour, a cronjob looks through the database for users that have some information waiting or an action to do, and sends this information to an (external) notification server that pushes these notifications to them. After the users get these notifications, they go to the app and use it, mostly for a very short time. An hour later, the same thing happens.

    Problem: In the last few weeks, usage of the application really started to grow. In the last few days we encountered very high load and a doubling of application response times during and after the sending of these notifications (so basically every hour). The server doesn't crash or stop responding to requests; it just gets slower and slower and often takes 20 minutes to recover - until the same thing starts again on the full hour. We have extensive monitoring in place (New Relic, collectd) but I can't figure out what's wrong; I can't find the bottleneck. That's where you come in: can you help me figure out what's wrong and maybe how to fix it?

    Additional information: The server is a 16-core Intel Xeon (8 cores with hyperthreading, I think) with 12 GB RAM running Ubuntu 10.04 (Linux 3.2.4-20120307 x86_64). Apache is 2.2.x and PHP is version 5.3.2-1ubuntu4.11. If any configuration information would help analyze the problem, just comment and I will add it.

    [Links to graphs followed here: phpinfo(), APC status, memcache status, collectd (processes, CPU, Apache, load, MySQL, vmem, disk) and New Relic (application performance, server overview, processes, network, disks). Sorry the graphs are GIFs and not the same time period, but I think the most important info is in there.]

    Read the article
