Search Results

Search found 59995 results on 2400 pages for 'web experience management'.


  • Snow Leopard Server, how to turn on compression for web

    - by gbrandt
    I am trying to make the webserver in Snow Leopard compress all output by default. The only thing I have found is to add SetOutputFilter DEFLATE in the .htaccess file for a directory. I really don't want to add an .htaccess file to every directory served. How can I globally get Apache2 on Snow Leopard to compress output?
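
    A minimal sketch of what a server-wide setting might look like, assuming mod_deflate is available and the main config is /etc/apache2/httpd.conf on Snow Leopard Server (the file path and module location are assumptions, not taken from the post):

        # Hypothetical global snippet for /etc/apache2/httpd.conf
        LoadModule deflate_module libexec/apache2/mod_deflate.so

        # Compress common text responses for every vhost and directory
        AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml application/javascript

    Placing the directive in the main config (or in a file pulled in via Include) avoids the per-directory .htaccess approach entirely.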

    Read the article

  • How to revert AIX service pack?

    - by unknown (google)
    I have an AIX 6.1 box with service pack level 6100-02-03-0909 swax23 # oslevel -s -q Known Service Packs 6100-02-03-0909 6100-02-02-0849 6100-02-01-0847 6100-02-00-0000 6100-01-04-0909 6100-01-03-0846 6100-01-02-0834 6100-01-01-0823 6100-00-08-0909 6100-00-07-0846 6100-00-06-0834 6100-00-05-0822 6100-00-04-0815 6100-00-03-0808 6100-00-02-0750 6100-00-01-0748 I want to revert the latest service pack and go back to 6100-02-02-0849. How do I do that?
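
    For context, a rough sketch of how a reject usually looks, assuming the 6100-02-03 filesets were installed in the APPLIED (not COMMITTED) state; flags and fileset names should be verified on the box before use:

        # List filesets still in the APPLIED state (only these can be rejected)
        installp -s

        # Reject an applied update together with its dependents
        installp -r -g -X <fileset_name>
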

    Read the article

  • Why doesn't the system information message shown when accessing an Ubuntu server match free -m?

    - by Andres
    Each time I SSH into my AWS Ubuntu servers I see a system information message, showing load, memory usage and packages available to install, like this: Welcome to Ubuntu 12.04.3 LTS (GNU/Linux 3.2.0-51-virtual x86_64) * Documentation: https://help.ubuntu.com/ System information as of Sun Nov 10 18:06:43 EST 2013 System load: 0.08 Processes: 127 Usage of /: 4.9% of 98.43GB Users logged in: 1 Memory usage: 69% IP address for eth0: 10.236.136.233 Swap usage: 100% Graph this data and manage this system at https://landscape.canonical.com/ 13 packages can be updated. 0 updates are security updates. Get cloud support with Ubuntu Advantage Cloud Guest http://www.ubuntu.com/business/services/cloud Use Juju to deploy your cloud instances and workloads. https://juju.ubuntu.com/#cloud-precise *** /dev/xvda1 will be checked for errors at next reboot *** *** System restart required *** My question is about the memory percentage shown. In this case it shows 69% memory usage, but since swap usage was 100% I checked it myself. When I run free -m I get this: total used free shared buffers cached Mem: 1652 1635 17 0 4 29 -/+ buffers/cache: 1601 51 Swap: 895 895 0 And that is, of course, closer to 100% than to 69%.
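
    As a sketch of how the two figures in the free -m output relate (this is just the standard free(1) arithmetic, using the values quoted above; it does not explain how the login banner computes its number):

        # Used memory once buffers and cache are subtracted (values in MB)
        echo $(( 1635 - 4 - 29 ))      # 1602, which free rounds to the 1601 shown on the "-/+ buffers/cache" line
        echo $(( 1602 * 100 / 1652 ))  # ~96-97% of RAM actually in use, versus the 69% in the banner
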

    Read the article

  • ASP.NET Security Exception when Switch IIS7 to Use UNC Path for Content

    - by Jeremy H.
    I have a Windows Server 2008 R2 box running IIS7.5 with Medium Trust configured for ASP.NET. When I have the website running from local content (e.g.: c:\inetpub\wwwroot) everything works fine. When I change IIS to use a UNC path for the content (e.g.: \\computer\wwwroot) I get the following error: Security Exception Description: The application attempted to perform an operation not allowed by the security policy. To grant this application the required permission please contact your system administrator or change the application's trust level in the configuration file. Exception Details: System.Security.SecurityException: Request for the permission of type 'System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed. I'm trying to figure out why ASP.NET/IIS would allow for the SQL call when using local content but not when using a UNC path. Any ideas what I need to do to use a UNC path from IIS7 properly?
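
    For reference, the usual way to grant code loaded from a UNC share full trust under CLR 2.0 is caspol; a hypothetical sketch (the framework path is an assumption, and the share name is just the post's example):

        rem Grant FullTrust to assemblies loaded from the UNC content share (CLR 2.0, 64-bit)
        %windir%\Microsoft.NET\Framework64\v2.0.50727\caspol.exe -m -ag 1. -url "file://\\computer\wwwroot\*" FullTrust

    Alternatively, the application's trust level can be raised in the configuration file, which is what the error message itself suggests.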

    Read the article

  • ftp user and www-data

    - by andrii
    I have a website that should be completely accessible to Apache (the www-data user), and an FTP user (user1, the same username on the server) who should be able to update the website. user1 needs to be able to add files to the website directory, and www-data needs full access to the added files. user1 is a member of the www-data group. Right now, when user1 uploads a file to the server, the file gets permissions 644, but I need 664. How can I set this up? Many thanks
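
    A minimal sketch of the two pieces that usually produce this behaviour, assuming vsftpd is the FTP daemon (the post does not say which one is in use, and the docroot path is a placeholder):

        # /etc/vsftpd.conf: make uploaded files group-writable, i.e. 664 instead of 644
        local_umask=002

        # Give the docroot to user1 with the www-data group, and make new files inherit that group
        chown -R user1:www-data /var/www/site
        chmod g+s /var/www/site
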

    Read the article

  • Magento Community - Hosting :: Need advice on which shared hosting will run Magento fast

    - by user43353
    Hi, I need advice on which shared hosting will run Magento Community fast, or some other inexpensive solution. This website will not have a lot of users; I only need it to run fast for 100-20 users at the same time. The problem with Magento is its database design, which makes the system very slow, and other things aren't the best either. I used hostmonster.com and justhost.com for a previous website, but they weren't fast enough even for a single user not located in the USA (my customer areas: Asia and Africa). Each action that involves the database takes a lot of time. Thanks

    Read the article

  • yum simulate install

    - by Michael Irey
    Coming from an Ubuntu perspective, if I want to check what additional packages will be installed or upgraded, I can use apt-get --simulate install <package name>. Is there something similar for yum? Our Red Hat box (yum) is our production server, so I would like to see exactly what will happen before I actually install a package. I couldn't really find a good solution; someone suggested yum --assumeno install <package name>, but this returned: Command line error: no such option: --assumeno. I'm using yum version 3.2.22. Any ideas or suggestions would be welcome.
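
    A sketch of common workarounds on a yum this old (no --assumeno); the package name is a placeholder:

        # Let yum resolve the transaction, review the "Dependencies Resolved" table, then answer N
        yum install somepackage

        # Or feed the prompt a "no" non-interactively so the run stops after the summary
        echo n | yum install somepackage

        # yum deplist shows declared dependencies, though not the exact transaction that would run
        yum deplist somepackage
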

    Read the article

  • Why does JMeter not work?

    - by Foolish
    I use JMeter's proxy server to record requests and then perform a performance test once all the requests have been recorded. These requests include a POST form. When I run the test cases, the POST form doesn't work -- it doesn't create a record in the website's database. Before this I used WebLOAD and everything was OK. What's the problem? What can I do about it?

    Read the article

  • Amazon DynamoDB or MySQL for storing large arrays inside each row

    - by Logan Besecker
    I am trying to decide which database I should use for an application I'm making. I was leaning toward DynamoDB because of its scalability, but then I read in the documentation that there is a limit of 64 KB on item size, although it looks like MySQL has a similar restriction documented here. This application will be storing a lot of data in two arrays, which could contain upwards of 10,000-100,000 strings each. I estimate that these strings will each be somewhere around 20 characters long, so each element of the array will be around 40 bytes and each array could be around 4 MB. Given this predicament, what database on Amazon AWS would you use, or how would you get around the limit on size per row? Thanks in advance, Logan Besecker
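
    The size arithmetic from the post itself, written out (all figures are the poster's own estimates):

        # 10,000 - 100,000 strings per array, ~20 characters (~40 bytes with overhead) each
        echo $(( 100000 * 40 ))    # 4,000,000 bytes, i.e. roughly 4 MB per array
        echo $(( 64 * 1024 ))      # 65,536 bytes: the per-item limit quoted above
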

    Read the article

  • Want to install FlightGear but 'libapr1' dependency cannot be satisfied

    - by Jonathan Reno
    When I want to install the flightgear:amd64 package, it requests to install the simgear2.8.0:amd64 package onto my Linux Mint 13 KDE 64-bit system. But I cannot install simgear2.8.0:amd64 because I could not find it in the GetDeb repositories, could not install it from PlayDeb, and could not find a .deb file online. So, I tried to install simgear2.8.0:i386, but it wants me to install (read reinstall) libapr1, but it is already installed properly with its dependencies. By the way, libapr1 is necessary for my Apache installation. Could you help me fix my problem?

    Read the article

  • Exchange 2010 OWA - a few questions about using multiple mailboxes

    - by Alexey Smolik
    We have an Exchange 2010 SP2 deployment and we need our users to be able to access multiple mailboxes in OWA. The problem is that a user (e.g. John Smith) needs to access not just somebody else's (e.g. Tom Anderson's) mailboxes, but his OWN mailboxes, e.g. in different domains: [email protected], [email protected], [email protected], etc. It is of course preferable for the user to work with all of his mailboxes from a single window. Such mailboxes can be added as multiple Exchange accounts in Outlook, and that works almost fine. But in OWA there are problems: 1) In the left pane - as I've learned - we can open only Inbox folders from other mailboxes. Is there no way to view all folders like in Outlook? 2) With Send-As permissions set, when trying to send a message from another address, that message is saved in the Sent Items folder of the mailbox that is opened in OWA, and not in the mailbox the message is sent from. The same goes for the trash can. Is there a way to fix that? Also, this problem exists in desktop Outlook when mailboxes are added automatically via the Auto Mapping feature, so we need to turn it off and add the accounts manually. Is there a simpler workaround? 3) Okay, suppose we only open Inbox folders in the left pane. The problem is that the mailbox names shown there are formed from the Display Name attribute. But those names are all identical! All the mailboxes are owned by John Smith, so they should all be named John Smith - so that a message recipient sees "John Smith" in the "from" field, no matter which mailbox it is sent from. Also, the user knows his own name - there is no need to tell him. He wants to know which mailbox he is working with. So we need a way to either: a) customize OWA to show the mailbox email address instead of the user Display Name, or b) make Exchange use another attribute in the "from" field when sending messages. 4) Okay, we can switch between mailboxes using "Open Other Mailbox" in the upper-right corner menu. But: a) To select a mailbox we need to enter its name (or its first letters). Is there a way to show a list of links to the mailboxes the user has full access to, e.g. in the page header? b) If we start entering the first letters, we see a popup list of mailboxes that could be opened. But it contains all mailboxes (apparently from the GAL), not only the ones the user has permission to open! How can we filter that popup list? c) The same naming problem as in (3). We can see the opened mailbox's email address ONLY in the page URL, which is insufficient for many users. In the left pane we see "John Smith", which is useless. 5) Each mailbox is tied to a separate user in AD. If one person has several mailboxes, we need additional dummy AD accounts, additional OUs to store them, etc. That's not very nice; is there any standardized, optimal way to build such a structure? We would really appreciate any answers or additional info on any of these questions. Thank you in advance.
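
    For the permission side of what the post describes (Full Access without Auto Mapping, plus Send As), the Exchange Management Shell commands look roughly like this sketch; the mailbox identities are placeholders patterned after the post's example, and -AutoMapping requires Exchange 2010 SP1 or later:

        # Full Access without Outlook's Auto Mapping, so the extra mailbox can be added manually
        Add-MailboxPermission -Identity "jsmith-domain2" -User "jsmith-domain1" -AccessRights FullAccess -AutoMapping $false

        # Send As on the secondary mailbox
        Add-ADPermission -Identity "jsmith-domain2" -User "jsmith-domain1" -AccessRights ExtendedRight -ExtendedRights "Send As"
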

    Read the article

  • How to generate customized sudoers files in puppet depending on the environment they're deployed to?

    - by gozu
    The sysadmins are present in the sudoers files of all environments, but other sudoers are not. Different environments all have slightly different sudoers. Most of the time, 90% of users are the same and 10% vary, so we cannot have only one sudoers file for everything. Right now, we are using Puppet with 10 different files with names like sudoers.production1, sudoers.production2, sudoers.production3, sudoers.testing1, sudoers.staging1 and so forth. Puppet then picks the file to deploy based on the server's $domain (ex: dbserver.staging1.acme.com) or $hardwaremodel. It works fine, but it's a nightmare to maintain so many files. I'd like to autogenerate sudoers files based on the server's domain and have only one big file with all the sudoers permissions for all users and all environments. Something that looks like: User_Alias ADMINS = abe, bob, carol, dave case $domain { "staging1.acme.com" { #add dev1,dev2,tester1,tester2 to sudoers file } "testing2.acme.com" { #add tester1, tester3, tester4 to sudoers file } What's the best way to go about this? Suggestions for alternatives are welcome. I'd appreciate any tips. Update 1: For security reasons, we'd rather not concatenate a bunch of files from a folder located on a puppet client, in case someone puts a file in there (maliciously or not) and either breaks the combined file or inserts something into it. Most importantly, for usability, we'd like to keep the number of sudoers-related files (fragment or complete) on the puppet server to either 3 (prod/stage/test) or preferably 1 file. This file would (somehow) generate sudoers files on the puppet server and send one customized file to each puppet client. The purpose is to be able to search for a username in a single file and remove it more quickly than across 11 files. When adding a user to a bunch of environments it won't be as quick, but only one file would need to be opened and looked at, greatly reducing the chances of an omission. Our sudo version is 1.6.9p8, so we can't use a sudoers.d folder, only a single sudoers file.
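
    A minimal sketch of the single-template approach, assuming an ERB template keyed off $domain (the class layout, template path and alias names below are hypothetical, not the poster's actual manifest):

        # manifest: render one sudoers file from one template
        file { '/etc/sudoers':
          owner   => 'root',
          group   => 'root',
          mode    => '0440',
          content => template('sudoers/sudoers.erb'),
        }

        # templates/sudoers/sudoers.erb (fragment)
        User_Alias ADMINS = abe, bob, carol, dave
        <% if domain == 'staging1.acme.com' -%>
        User_Alias STAGING1 = dev1, dev2, tester1, tester2
        STAGING1 ALL=(ALL) ALL
        <% end -%>

    Running the rendered file through visudo -c -f before shipping it guards against a syntax error locking everyone out.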

    Read the article

  • Turn Off Google Chrome Annoying Link Hover

    - by Volomike
    I like Google Chrome a lot, but there are two problems I have with it: the Address Bar search history feature, which I want to turn off, and the feature where I hover over a hyperlink and the bottom left-hand corner shows a light blue tooltip with where that link goes. I really don't care where a given link goes and wish I could turn that feature off, even if I have to install a Google Chrome extension. For these two reasons, I will not install Google Chrome as my primary browser and will remain with Firefox. So, is there a fix to turn the annoying link hover tooltip feature off?

    Read the article

  • Why do AWS spot-instance prices spike above the "on demand" pricing?

    - by Laykes
    Amazon pricing on spot instance inconsistencies: this is best explained through the historical chart of instance pricing (screenshots not included here). If you look at a lot of the spot instance prices, you will notice regular patterns of spikes. As the chart shows, the price for this compute-medium instance regularly spikes above the on-demand price. A c1.medium instance (on demand) would only cost $0.186 per hour, but for a period of a few weeks, in zone B, the price would regularly spike to $1.20. This is some 6 times the actual on-demand price. It's also not isolated: if you look at zone B again for small instances, there are similar frequent spikes, which go to 4x the on-demand pricing. Does anyone know why this happens? Here are a few suggestions: Someone entered $1.2 instead of $0.12 (I would discount this since it happened 20 times over the space of 3 weeks). Amazon regularly artificially inflates their prices by bidding on their own instances to get the most bang for their buck (I would discount this since it would be ridiculous and bad business). Some company launched 1000 servers at once and wants to make sure that they all launch (I would discount this since they would presumably launch them at a price below the minimum on-demand price; why would you pay above on demand for a single server?). Is it a bug in their reporting?

    Read the article

  • Amazon AWS EC2 + Puppet, get Puppet to know AWS instance tags

    - by Piotr Jasiulewicz
    I am having a problem with my AWS deployment; I'm fairly new to AWS and Puppet. So, coming to my question - can you distinguish Puppet nodes by AWS machine tags or CNAME domains? A little background about the plan: have multiple clusters of machines - one PHP cluster, one legacy PHP cluster, one Java cluster, one Perl cluster; control configuration with Puppet - I'm still pretty new to Puppet, but as a developer I like the idea of being able to version control the configuration of servers; have autoscaling enabled on those clusters - obviously the main benefit of the cloud, which makes the much higher cost for any reasonable performance worth it (those Amazon machines are slower than my phone...); deployment controlled by Capistrano, which makes things a lot easier. So in AWS you get those super nasty public/private machine DNS names... no way you can identify machines by those. To ease the problem, it seems AWS wants you to tag everything - so I did. I found a script that makes a CNAME record for each machine from the "ShortName" tag, thanks to the Route 53 API. Every machine has a ShortName tag that becomes its CNAME; unfortunately Puppet still resolves the private DNS name. I'd like to have node 'perl-cluster'{} in Puppet; does anyone have any clue how to achieve this? Thanks
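
    One common pattern, sketched under assumptions (the fact name, file path and role classes are hypothetical, and it assumes a Facter version that reads external facts from facts.d): have the same boot script that creates the CNAME also drop the ShortName tag into a fact, then branch on the fact instead of the node's DNS name:

        # /etc/facter/facts.d/cluster.txt, written at boot from the ShortName tag
        # cluster=perl-cluster

        # site.pp: select configuration by the custom fact rather than the EC2 hostname
        node default {
          case $::cluster {
            'perl-cluster': { include role::perl }
            'php-cluster':  { include role::php }
            default:        { include role::base }
          }
        }
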

    Read the article

  • Windows 7 Can't Get Past Format

    - by Soren
    Just assembled a new machine: MSI 880GM-E41 motherboard, 500GB Samsung hard drive, AMD Athlon II X4 640 processor, 4GB RAM, Windows 7 64-bit, LITE-ON Black 4X Blu-ray reader. I boot up and start the install. When it gets to the screen to select which partition, the motherboard loses power. It doesn't make sense: I can leave it in the BIOS settings for a very long time and it doesn't lose power. When it does lose power, the monitor shuts down and the mouse loses power, but the power light stays on. I cannot tell if it is a hardware or software problem.

    Read the article

  • Installing an EC2 (EBS based) instance on another AWS account

    - by imaginative
    We're moving from one AWS account to another. I have a couple of EC2 instances I would like to move over. These instances are EBS-based. Googling around, I'm seeing all sorts of convoluted answers for how to appropriately launch an EBS-based instance on a new account. Surely there has to be a simple way of going about this. What I have done so far is back up the EBS volume as a snapshot. I then altered the permissions of the snapshot to allow the new AWS account number to access it. I am not sure where to go from here. Do I have to create an image from the EBS snapshot? Upload it to S3? What are the next steps?
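
    A rough sketch of the remaining steps with the AWS CLI, run from the new account; the snapshot ID, AMI name, instance type and device name are placeholders, and paravirtual AMIs of that era would also need a kernel (aki) ID:

        # Register an AMI whose root volume comes from the shared snapshot
        aws ec2 register-image \
            --name "migrated-server" \
            --architecture x86_64 \
            --root-device-name /dev/sda1 \
            --block-device-mappings '[{"DeviceName":"/dev/sda1","Ebs":{"SnapshotId":"snap-0123456789abcdef0"}}]'

        # Then launch it as usual with the returned AMI ID
        aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type m1.small
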

    Read the article

  • Does this file format exist?

    - by Jon Chase
    Is there a file format that handles the following use case... I'd like to create a tar file (or whatever - I'm just using tar here because it's a well-known file format for containing multiple files) that would be usable even if I only had access to specific chunks of said file. For example, say I tar up my mp3 and photo collection into a 100GB tar file, then put the file into some long-term storage somewhere. Later, I want to access a specific mp3 file. I don't want to download the entire 100GB tar file just to get to one mp3. In fact, let's say I can't download the entire 100GB tar file. Instead, I'd like to say "give me megabytes 10 through 19 of the 100GB tar file" and then have the mp3 magically extracted from those 10 megabytes. Does a file format like this exist?
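
    The "megabytes 10 through 19" request itself needs no special format; with standard tools it is just a ranged read, as in this sketch (file names are placeholders):

        # Pull a 10 MB slice starting at the 10 MB mark out of the archive
        dd if=collection.tar of=slice.bin bs=1M skip=10 count=10

    The real problem is knowing which byte range holds a given member, which is why formats with a central index (zip, or a tar paired with an external index) fit this use case better than plain tar.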

    Read the article

  • Puppet - how can I copy a file to several user folders?

    - by Eliot Rocha
    Well, I was using the info from this question: Puppet - Any way to copy predefined custom configuration files for software on clients from the puppet master (host)? But I need something more elaborate, because I have several desktops, each in use by 2 or 3 users, and I want to make a class that copies a shortcut to their desktops. The computers are joined to a domain, so any user can log in to any desktop, and their profile is created on every desktop. I've tried this: class applink { file { "/home/installer/Escritorio/Workdesktop.desktop": owner => installer, group => root, mode => 770, source => "puppet://$server/files/Workdesktop.desktop" } } This is only for one user called "installer"; how can I do this for several users? Can I use $USER for this? Any thoughts? Thank you!
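
    A minimal sketch of one way to do it with a defined type; usernames other than "installer" are placeholders, and the source path is the one from the post:

        define applink::shortcut {
          file { "/home/${name}/Escritorio/Workdesktop.desktop":
            owner  => $name,
            group  => 'root',
            mode   => '0770',
            source => "puppet://$server/files/Workdesktop.desktop",
          }
        }

        # One resource per user who needs the shortcut
        applink::shortcut { ['installer', 'user2', 'user3']: }
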

    Read the article

  • Outlook 2007 / 2010 Calendar: hide meetings in specific category

    - by Jeroen
    Question Is there any easy way in Outlook 2007/2010 to show/hide meetings in a specific category? Preferably only for a specific view (the Month view, in this case). Note: I was almost done writing this question, adding just one more "What I've tried" option, when I found an acceptable (though imperfect) solution. Remembering this SE blog post I figured I might as well post it after all and answer it myself. And who knows, perhaps someone else has a more elegant solution. The reason for me personally is that I'd like to hide the "small, recurring meetings" like our daily stand-up meeting in the month view. I'd prefer an Outlook feature that is meant for this (there must be one for this, right?), but I'm open to workarounds or plugin suggestions as well. What I expected to find somewhere was a list of categories (with added option "No category") where you could select/deselect from which categories you'd see meetings. Something like this mock-up: What I've tried Edit "View Settings", and use a "Filter..." on categories. This has several disadvantages, the major one is that the filter only allows me to choose what I want to show, but not what I want to hide. Even if I tick all categories but one for the filter it would still hide any uncategorized meeting. Similar to 1, but then using Advanced filters. Still a bit clumsy as changing views can be up to three clicks, but this is the best solution so far (see the corresponding answer below). Creating a sub-calendar for these "small" meetings that I wish to hide. This felt a bit clumsy and like overkill, but did provide an easy "select/deselect" option to show/hide these meetings. Search for plug-ins that do this. Couldn't find one (yet).
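
    For option 2, the filter string on the SQL tab of the advanced Filter dialog would look something like this sketch; the category name "Standup" is hypothetical, and categories are addressed in DASL via the Keywords property:

        NOT("urn:schemas-microsoft-com:office:office#Keywords" LIKE '%Standup%')
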

    Read the article
