Search Results

Search found 21661 results on 867 pages for 'look alterno'.


  • Install php extensions on ubuntu

    - by nute
    I have an Ubuntu 9.10 server. I have installed apache2 and php5 using apt-get. How does one install PHP extensions? Are there apt-get commands to get them, or should I manually look for the files on the PHP website and set them up in php.ini? More specifically, I need mcrypt, curl and gd. Thanks
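
    A minimal sketch of the usual approach, assuming the stock Ubuntu 9.10 repositories (package names are from memory and worth double-checking): the extensions ship as separate php5-* packages, and Apache needs a restart to pick them up.

        sudo apt-get install php5-mcrypt php5-curl php5-gd
        sudo /etc/init.d/apache2 restart
        php -m | grep -E 'mcrypt|curl|gd'   # confirm the modules are loaded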

    Read the article

  • Win7 playback of dvr-ms files stutters

    - by Jim Lynn
    I've just had to install Windows 7 on my Media Center machine because my Vista installation had a faulty drive. I've got the latest drivers that I can find - Intel 945GM integrated Graphics, Realtek audio drivers. Things are working OK with one exception. Playback of old recordings, from dvr-ms format files, is choppy. The picture freezes for a fraction of a second, then quickly catches up. The sound is uninterrupted and doesn't pause. These freezes happen once every 5 seconds or so. It's very regular. Playback of Live TV from the digital tuner is perfectly smooth. DVD playback is perfectly smooth.

    As an experiment, I used the MPEG editing package VideoReDo to create a small test file in three different formats. This program takes the raw MPEG streams and repackages them into the desired container. I took the same clip and created three files in three formats: dvr-ms (Microsoft's old recorded TV format); mpg (standard MPEG); and ts (raw MPEG transport stream of the kind often produced by PVRs). When these three files are played back under Windows 7, the mpg and ts files play smoothly, but the dvr-ms file stutters.

    The last piece of data I have is that two other Windows 7 machines can play back dvr-ms files smoothly with no stuttering. One is a netbook, with less grunt than the media centre. So there must be something specific about my Media Center machine that's causing the problem. Does anyone have any idea where I can look now? I don't know much about AV software, codecs, filter graphs etc. but I suspect that's where the problem lies. Rendering the video isn't the problem, but extracting the streams is. How would I go about diagnosing the problem?

    Edited to add: I just used the GraphStudio tool to look at the filter graph on the offending PC. The filter graph it uses by default for dvr-ms looks identical to the other machines, and, interestingly, when I play the files using GraphStudio they run smoothly. Under Windows Media Player and Windows Media Center they stutter. I'd like to see the filter graph for WMP but GraphStudio won't show it. It looks like WMP and WMC are using a different decoding path to GraphStudio.

    Edited again to add: Today I purchased a new HDTV. The same Media Center driving the TV at 1080p is now playing back the old Recorded TV files smoothly, without stuttering. So whatever the cause of the original problem, using a different resolution seems to have removed the problem. It might also explain why nobody else has had this problem. I doubt many people use Media Center with a 14in portable TV.

    Read the article

  • Arch linux, openbox Monaco font problem

    - by z33m
    I'm trying to set up a minimal desktop with the Openbox window manager on Arch Linux. I noticed some weird font rendering issues with the Monaco font. Below font size 13, alternate font sizes are rendered in an aliased, ugly manner, and some of the characters even look completely different. The same Arch installation has no problem rendering Monaco when running under Xfce. I tried tweaking my .fonts.conf, but no luck.

    Read the article

  • NginX GeoIP cause extra load?

    - by Miko
    Because Nginx requires the geoip_ directives to go into the main http{ } block of the nginx.conf file, does that mean the geoip data is being pulled for every single request? In other words, does EngineX look up the geoip data for ALL of the requests coming in, even for those not needing the data? Also, nginx's documentation page lists "geoip_country" as a valid variable but if I use it, EngineX throws the following error: [emerg]: unknown "geoip_country" variable
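
    For what it's worth, a rough sketch of how the GeoIP module is usually wired up (the database path and country code below are placeholders): geoip_country is a directive that loads the database, while the per-request values it exposes are variables such as $geoip_country_code, which is likely why the literal name "geoip_country" is rejected as a variable. As far as I can tell, nginx evaluates variables lazily, so the lookup should only happen for requests that actually reference one of the $geoip_* variables, even though the directive itself must sit in the http block.

        http {
            geoip_country /usr/share/GeoIP/GeoIP.dat;   # directive: loads the country database

            server {
                location / {
                    if ($geoip_country_code = CN) {      # variable set by the module per request
                        return 403;
                    }
                }
            }
        }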

    Read the article

  • Postfix smtp relay

    - by plucked
    Hi there, I want to set up a second SMTP server to receive emails when the primary server is down. The primary server is configured by OS X Server; the secondary is a Postfix setup on Debian. The secondary is able to accept emails, but how do I push them to the primary server? I need a little hint about where to look in the configuration (I do most things with Webmin (o: ). Thank you... Rainer
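
    A rough sketch of a classic backup-MX setup in Postfix, where example.com and mail.example.com are placeholders for the real domain and the primary server's hostname: list the domain as a relay domain and Postfix will queue the mail and keep retrying delivery to the primary until it comes back; a transport entry can pin delivery to a specific host if you don't want to rely on MX records.

        postconf -e "relay_domains = example.com"
        # Optional but recommended: list valid recipients so unknown users are rejected instead of queued
        # postconf -e "relay_recipient_maps = hash:/etc/postfix/relay_recipients"
        postconf -e "transport_maps = hash:/etc/postfix/transport"
        echo "example.com    smtp:[mail.example.com]" > /etc/postfix/transport
        postmap /etc/postfix/transport
        /etc/init.d/postfix reload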

    Read the article

  • how to mount a partition inside a partition

    - by facha
    Hello, everyone. I have a block device (/dev/sda5) that has been partitioned internally by a virtual machine. So when I look inside with fdisk /dev/sda5, I see sda5p1, sda5p2 and so on. Is it possible to mount them on my host system? Thanks in advance.
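
    One common way to do this, sketched on the assumption that kpartx (from multipath-tools) is available: it reads the partition table inside the device and creates device-mapper entries you can mount.

        sudo kpartx -av /dev/sda5        # creates /dev/mapper entries for the inner partitions (e.g. sda5p1)
        sudo mount /dev/mapper/sda5p1 /mnt
        # ... work with the files ...
        sudo umount /mnt
        sudo kpartx -d /dev/sda5         # remove the mappings when done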

    Read the article

  • SquirrelMail (Courier) IMAP Issue

    - by Nik
    Alright, so I'm having an issue with SquirrelMail and Courier IMAP. When I try to log in to SquirrelMail, it throws this error at me: ERROR: Connection dropped by IMAP server. The IMAP server is running on 993 without SSL (which might be the problem). How do I fix this? I've already taken a look at the official documentation for this error, with no fix.
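
    A quick way to check that suspicion with standard tools (hostname is a placeholder): port 993 is conventionally IMAP-over-SSL, so see what the port actually speaks before changing SquirrelMail's settings.

        # If the port really is SSL, this should print a certificate and an IMAP greeting:
        openssl s_client -connect localhost:993
        # Plain IMAP normally lives on 143; a plain-text connection can be tested with:
        telnet localhost 143

    If 993 does speak SSL, SquirrelMail needs TLS enabled for its IMAP connection (its config/conf.pl utility has settings for the IMAP port and a secure connection); if it genuinely is plain text on 993, moving Courier back to 143 or enabling imapd-ssl is the more conventional fix.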

    Read the article

  • git post-receive hook throws "command not found" error but seems to run properly and no errors when run manually

    - by Ben
    I have a post-receive hook that runs on a central git repository set up with gitolite to trigger a git pull on a staging server. It seems to work properly, but throws a "command not found" error when it is run. I am trying to track down the source of the error, but have not had any luck. Running the same commands manually does not produce an error. The error changes depending on what was done in the commit that is being pushed to the central repository. For instance, if a 'git rm' was committed and pushed to the central repo the error message will be "remote: hooks/post-receive: line 16: Removed: command not found", and if a 'git add' was committed and pushed to the central repo the error message will be "remote: hooks/post-receive: line 16: Merge: command not found". In either case the 'git pull' run on the staging server works correctly despite the error message.

    Here is the post-receive script:

        #!/bin/bash
        #
        # This script is triggered by a push to the local git repository. It will
        # ssh into a remote server and perform a git pull.
        #
        # The SSH_USER must be able to log into the remote server with a
        # passphrase-less SSH key *AND* be able to do a git pull without a passphrase.
        #
        # The command to actually perform the pull request on the remote server comes
        # from the ~/.ssh/authorized_keys file on the REMOTE_HOST and is triggered
        # by the ssh login.

        SSH_USER="remoteuser"
        REMOTE_HOST="staging.server.com"

        `ssh $SSH_USER@$REMOTE_HOST` # This is line 16

        echo "Done!"

    The command that does the git pull on the staging server is in the ssh user's ~/.ssh/authorized_keys file and is:

        command="cd /var/www/staging_site; git pull",no-port-forwarding,no-X11-forwarding,no-agent-forwarding, ssh-rsa AAAAB3NzaC1yc2EAAAABIwAA... (the rest of the public key)

    This is the actual output from removing a file from my local repo, committing it locally, and pushing it to the central git repo:

        ben@tamarack:~/thejibe/testing/web$ git rm ./testing
        rm 'testing'
        ben@tamarack:~/thejibe/testing/web$ git commit -a -m "Remove testing file"
        [master bb96e13] Remove testing file
         1 files changed, 0 insertions(+), 5 deletions(-)
         delete mode 100644 testing
        ben@tamarack:~/thejibe/testing/web$ git push
        Counting objects: 3, done.
        Delta compression using up to 2 threads.
        Compressing objects: 100% (2/2), done.
        Writing objects: 100% (2/2), 221 bytes, done.
        Total 2 (delta 1), reused 0 (delta 0)
        remote: From [email protected]:testing
        remote: aa72ad9..bb96e13 master -> origin/master
        remote: hooks/post-receive: line 16: Removed: command not found   # The error msg
        remote: Done!
        To [email protected]:testing
        aa72ad9..bb96e13 master -> master
        ben@tamarack:~/thejibe/testing/web$

    As you can see the post-receive script gets to the echo "Done!" line, and when I look on the staging server the git pull has been successfully run, but there's still that nagging error message. Any suggestions on where to look for the source of the error message would be greatly appreciated. I'm tempted to redirect stderr to /dev/null but would prefer to know what the problem is.
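
    A likely explanation, offered as a guess rather than a confirmed diagnosis: line 16 wraps the ssh command in backticks, which is command substitution. Bash runs the ssh (so the pull happens and the hook continues), then tries to execute whatever text ssh printed, so the first word of the pull's output ("Removed ..." or "Merge ...") gets treated as a command name, hence "command not found". Dropping the backticks would be the usual fix:

        # Instead of:
        #   `ssh $SSH_USER@$REMOTE_HOST`   # backticks run ssh, then try to execute its output
        # call ssh directly so its output is simply printed:
        ssh "$SSH_USER@$REMOTE_HOST"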

    Read the article

  • Forefront 2010 Antispam vs Exchange 2010 Antispam?

    - by Jon
    They look pretty similar; do they work together or independently? For example, you have content filtering in Forefront where you can specify SCL thresholds, just like in Exchange. However, there's nowhere to specify the spam mailbox. So will the spam mailbox still be used if I configure this in Forefront?

    Read the article

  • Apache 2.2 with Tomcat

    - by Andrea Baccega
    Hello there, I'm trying to set up a dev environment with Apache 2.2 + Tomcat + MySQL. Of course I already have Apache 2.2 + MySQL working fine with PHP, but when I look on Google for how to set up Tomcat I find a lot of confusion: some use a proxy, some use RewriteRules, and so on... Could you please give me some info/links about how to accomplish this task? Bests, Andrea
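
    A minimal sketch of one common arrangement, mod_proxy_ajp forwarding a path to Tomcat's AJP connector; the /myapp context and file paths are placeholders, and a Debian/Ubuntu-style Apache layout is assumed.

        sudo a2enmod proxy proxy_ajp
        sudo tee /etc/apache2/conf.d/tomcat.conf <<'EOF'
        # Forward /myapp to the Tomcat AJP connector (port 8009 by default)
        ProxyPass        /myapp ajp://localhost:8009/myapp
        ProxyPassReverse /myapp ajp://localhost:8009/myapp
        EOF
        sudo /etc/init.d/apache2 restart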

    Read the article

  • HP SmartArray P212 with non HP disks, Insight Diagnostics error

    - by yonatan
    I have an ML110 G6 with a SmartArray P212 and two Seagate (non-HP) SAS disks in Raid 1. When I ran HP Insight Diagnostics I got some errors related to S.M.A.R.T. error testing and I would like to confirm that this is due to the controller not being able to query the drives as they are non-HP. I believe that the drives are not failing, but I want to be sure. Please have a look at these screenshots I took from the Insight Diagnostics report:

    Read the article

  • AppArmor Profile for PowerDNS

    - by Cory J
    I am currently working on a new authoritative nameserver using PowerDNS on Ubuntu 8.04 LTS. I'd like to have AppArmor protecting this service like it did with BIND, but when I look in /etc/apparmor.d/ there is no AppArmor profile for this service installed by default. Do any experienced pdns admins know which files pdns accesses, so I can define a profile? Or better yet, does anyone HAVE a profile for pdns? Many thanks for any suggestions.
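
    One way to answer the "which files does it access" question empirically, sketched here assuming the apparmor-utils package is available and the daemon binary lives at /usr/sbin/pdns_server: aa-genprof watches the process in complain mode and builds a profile from what it actually touches.

        sudo apt-get install apparmor-utils
        sudo aa-genprof /usr/sbin/pdns_server      # then exercise pdns (restart it, run some queries)
        sudo aa-logprof                            # fold any later denials back into the profile
        sudo aa-enforce /etc/apparmor.d/usr.sbin.pdns_server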

    Read the article

  • Storage device not found on ESX4 with AIC-9410

    - by Mads
    I am trying to install ESX 4.0 update 1 on a Supermicro X7DBR-3 system with an embedded AIC-9410 HBA (this HBA is listed on the HCG with Vendor ID 9005 and Device ID 041f) . All SATA controllers are disabled in the BIOS and the logical drive shows up in the Adaptec device summary during POST, however there is nothing listed on the Storage Device screen. The HBA itself is listed if I run esxcfg-info but not if I run esxcfg-scsidevs -a (under ESXi for that last command) Any ideas where I can look next or what might be wrong?

    Read the article

  • varnish daemon error: libvarnish.so.1 not found

    - by Max
    In order to try out varnish for an upcoming project I installed it on an Ubuntu server using this tutorial: http://varnish-cache.org/wiki/InstallationOnUbuntuDapper The build process worked without any errors, but I can't start the varnish daemon. I always get the error message "varnishd: error while loading shared libraries: libvarnish.so.1: cannot open shared object file: No such file or directory", but /usr/local/lib/libvarnish.so.1 clearly exists. How can I tell varnish to look in that directory and load the library?
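
    The usual remedy when a library installed under /usr/local/lib isn't found, sketched below (the conf.d filename is arbitrary): tell the dynamic linker about the directory and refresh its cache.

        echo "/usr/local/lib" | sudo tee /etc/ld.so.conf.d/local.conf
        sudo ldconfig
        # or, as a one-off test without touching the linker config:
        LD_LIBRARY_PATH=/usr/local/lib varnishd -V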

    Read the article

  • Driver Scanner detect driver needed

    - by Pennf0lio
    Hi, I'm currently fixing my friend's laptop. Her laptop is quite old and is not a well-known brand (Redfox Navigator), so there is a lack of support online. Is there software that you can install that will scan the system and look for the drivers it needs? Note: the laptop can't connect to the internet. Thanks!

    Read the article

  • Nginx: Can I cache a URL matching a pattern at a different URL?

    - by Josh French
    I have a site with some URLs that look like this: /prefix/ID, where /prefix is static and ID is unique. Using Nginx as a reverse proxy, I'd like to cache these pages at the /ID portion only, omitting the prefix. Can I configure Nginx so that a request for the original URL is cached at the shortened URL? I tried this (I'm omitting some irrelevant parts) but obviously it's not the correct solution:

        http {
            map $request_uri $page_id {
                default $request_uri;
                ~^/prefix/(?<id>.+)$ $id;
            }

            location / {
                proxy_cache_key $page_id
            }
        }

    Read the article

  • Web monitoring on SBS2003

    - by thestig
    Hi, quick question: shouldn't Microsoft Small Business Server 2003 be able to report back on web usage as well as email usage? I am currently getting a report back covering email, flaws and memory usage, but nothing on web usage. I have been given full responsibility for looking after my company's server but have never really done this before, so I thought I'd look to the pros. Any help would be greatly appreciated, Gerard

    Read the article

  • Dual NIC internet access

    - by JavaRocky
    Q1. If a computer (let's say Windows) had two NICs, which interface would HTTP traffic (or any traffic, for that matter) go out on? I'm not sure what the routing table would look like. Q2. If one NIC's link becomes unresponsive, would traffic be automatically routed to the other NIC? Thanks.
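
    A rough way to see the decision in practice on Windows (the gateway address below is a placeholder): the routing table normally holds one 0.0.0.0/0 default route per gateway, and the entry with the lowest metric wins, so all outbound traffic, HTTP included, leaves via that NIC. Failover to the other default route only happens if Windows decides the first gateway is dead, which is not guaranteed for every failure mode.

        :: Show the routing table; compare the metrics of the 0.0.0.0 entries
        route print

        :: Example: prefer the NIC behind 192.168.1.1 by giving its default route a lower metric
        route add 0.0.0.0 mask 0.0.0.0 192.168.1.1 metric 10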

    Read the article

  • Best VPS based in UK?

    - by jimbo
    I am on the hunt for a new VPS supplier based in the UK. I was loving the look and service provided by Media Temple and was going to go down that route, but would prefer something hosted in the UK. Any suggestions more than welcome...

    Read the article

  • Dynamics CRM 4.0 Access Audit?

    - by Dan
    In Microsoft Dynamics CRM 4.0, is there any way to audit what records were viewed by a particular individual at what time without any special plugins? If you need a plugin, can you install the plugin and then look at past data?

    Read the article

  • Performance issues concurrently running MySQL and MS SQL Sever

    - by pacifika
    We're considering installing MySQL on the same database server that has been running MS SQL Server. From my research there are no technical issues with running both concurrently, but I am worried that performance will be affected. Is SQL Server set up by default to use all available memory, for example? What should I look out for? Thanks
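
    On the memory question: by default SQL Server's "max server memory" setting is effectively unlimited, so over time it will grow to consume most of the box (it releases memory under OS pressure, but not always quickly). A common precaution when co-hosting another database is to cap it explicitly; a sketch in T-SQL, with 4096 MB as a placeholder value:

        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max server memory (MB)', 4096;   -- placeholder: leave room for MySQL and the OS
        RECONFIGURE;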

    Read the article

  • script to recursively check for and select dependencies

    - by rp.sullivan
    I have written a script that does this, but it is one of my first scripts ever so I am sure there is a better way :) Let me know how you would go about doing this. I'm looking for a simple yet efficient way to do it. Here is some important background info (it might be a little confusing, but hopefully by the end it will make sense):

    1) This image shows the structure/location of the relevant dirs and files.

    2) The packages.file located at ./config/default/config/packages is a space-delimited file. Field 5 is the "package name", which I will call $a for explanation's sake. Field 4 is the name of the dir containing the $a dir, which I will call $b. Field 1 shows whether the package is selected or not: "X" (capital x) for selected and "O" (capital o as in orange) for not selected. Here is an example of what the packages.file might contain:

        ...
        X ---3------ 104.800 database gdbm 1.8.3 / base/library CROSS 0
        O -1---5---- 105.000 base libiconv 1.13.1 / base/tool CROSS 0
        X 01---5---- 105.000 base pkgconfig 0.25 / base/tool CROSS 0
        X -1-3------ 105.000 base texinfo 4.13a / base/tool CROSS DIETLIBC 0
        O -----5---- 105.000 develop duma 2_5_15 / base/development CROSS NOPARALLEL 0
        O -----5---- 105.000 develop electricfence 2_4_13 / base/development CROSS 0
        O -----5---- 105.000 develop gnupth 2.0.7 / extra/development CROSS NOPARALLEL FPIC-QUIRK 0
        ...

    3) For almost every package listed in the packages.file there is a corresponding .cache file. The .cache file for package $a would be located at ./package/$b/$a/$a.cache. The .cache files contain a list of dependencies for that particular package. Here is an example of what one of the .cache files might look like. Note that the dependencies are field 2 of the lines containing "[DEP]"; these dependencies are all names of packages in the packages.file.

        [TIMESTAMP] 1134178701 Sat Dec 10 02:38:21 2005
        [BUILDTIME] 295 (9)
        [SIZE] 11.64 MB, 191 files
        [DEP] 00-dirtree
        [DEP] bash
        [DEP] binutils
        [DEP] bzip2
        [DEP] cf
        [DEP] coreutils
        ...

    So with all that in mind, I'm looking for a shell script that, from within the "main dir":
    - looks at the ./config/default/config/packages file, finds the "selected" packages and reads the corresponding .cache files;
    - then compiles a list of dependencies that excludes the already selected packages;
    - then selects those dependencies (by changing field 1 to X) in the ./config/default/config/packages file, and repeats until all the dependencies are met.

    Note: the script will ultimately end up in the "scripts dir" and be called from the "main dir". If this is not clear, let me know what needs clarification. For those interested, I'm playing around with T2 SDE. If you are into playing around with Linux it might be worth taking a look.
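
    A rough sketch of how such a loop could be written in bash, based only on the field layout described above (untested against a real T2 tree, so treat it as a starting point rather than a finished script): it repeatedly collects the [DEP] entries of every selected package, flips any that are still marked O to X, and stops when a full pass selects nothing new.

        #!/bin/bash
        # Sketch only: resolve dependencies of selected packages in a T2-style tree.
        # Assumes single-space-separated fields as shown above and regex-safe package names.
        PKGFILE=./config/default/config/packages

        while true; do
            changed=0

            # Walk every selected package (field 1 = X, field 4 = dir, field 5 = name).
            while read -r sel f2 f3 dir name rest; do
                [ "$sel" = "X" ] || continue
                cache="./package/$dir/$name/$name.cache"
                [ -f "$cache" ] || continue

                # Field 2 of each [DEP] line is a dependency's package name.
                for dep in $(awk '$1 == "[DEP]" { print $2 }' "$cache"); do
                    # If that dependency is still unselected (leading O), flip it to X.
                    # (sed -i rewrites the file; newly selected packages are picked up on the next pass.)
                    if grep -q "^O \([^ ]* \)\{3\}$dep " "$PKGFILE"; then
                        sed -i "s/^O \(\([^ ]* \)\{3\}$dep \)/X \1/" "$PKGFILE"
                        changed=1
                    fi
                done
            done < "$PKGFILE"

            # A full pass that selects nothing new means every dependency is met.
            [ "$changed" -eq 0 ] && break
        done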

    Read the article

  • Getting more helpful tab completion prompts in bash?

    - by Rory McCann
    Let's say I have a directory with a few files in it like this:

        $ ls
        file1  file2  file3

    And I want to do some tab completion in bash:

        $ cat file<tab>
        file1  file2  file3

    I remember seeing someone doing tab completion where the shell bolded the next parts, so in this case it would bold the 1, 2 and 3 of the filenames (file1, file2, file3 with just the digits highlighted), which tells you what you should type next. I think this was a feature of zsh, but is there any way to get it in bash?
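
    A hedged pointer rather than a confirmed answer: newer readline versions (roughly the bash 4.4 / readline 7.0 era and later, so not the stock bash of older distributions) can colour the part of the completions you have already typed, which visually sets off the remaining characters and is close to the zsh behaviour described. These are readline settings and go in ~/.inputrc:

        # ~/.inputrc (needs a reasonably new bash/readline; reload with Ctrl-x Ctrl-r or a new shell)
        set colored-stats on                 # colour completions by file type
        set colored-completion-prefix on     # highlight the prefix you have already typed
        set show-all-if-ambiguous on         # list all matches on the first Tab press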

    Read the article
