Search Results

Search found 12857 results on 515 pages for 'spatial index'.


  • How to set up Mac OS X-like dragging behaviour on Linux

    - by ticking
    I would like to use Linux (Fedora) on an aluminum MacBook Pro, but since the touchpad on a MacBook Pro is only one giant button, Apple does some custom tracking. When one finger is placed and a second follows (the click can occur before or after that), it is interpreted as a drag. So the strong thumb can hold the pressure and the more accurate index finger can do the pointing. But Linux interprets this as a right click, since it only cares whether two fingers are on the pad. Is there a way to achieve said behaviour? Cheers, Jan

    Read the article

  • Rails with phusion passenger and wordpress

    - by Venu
    We had a site developed using Ruby on Rails. It had: a website, web services for a mobile app, and an admin panel to manage data. We started using WordPress to manage the site content. We have finished development and have to move to production now. This is the current virtual host code that makes WordPress work under the /wordpress URI:

        <Location /wordpress>
            PassengerEnabled off
            <IfModule mod_rewrite.c>
                RewriteEngine On
                RewriteBase /wordpress/
                RewriteCond %{REQUEST_FILENAME} !-f
                RewriteCond %{REQUEST_FILENAME} !-d
                RewriteRule . /wordpress/index.php [L]
            </IfModule>
        </Location>

    I want to make Phusion Passenger work for the /admin and /api URIs, and / to go to WordPress. Can we change the document root based on the URI, or is there a better solution?
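    A hedged sketch of one layout sometimes used for this (all paths are placeholders, and the directives assume a reasonably recent Phusion Passenger): make WordPress the document root with Passenger disabled there, and mount the Rails app only under the sub-URIs with Alias plus PassengerBaseURI:

        DocumentRoot /var/www/wordpress
        <Directory /var/www/wordpress>
            PassengerEnabled off            # WordPress served as plain PHP
            AllowOverride All
        </Directory>

        # Rails app exposed only under /admin and /api (paths are placeholders)
        Alias /admin /var/www/railsapp/public
        Alias /api   /var/www/railsapp/public
        <Location /admin>
            PassengerBaseURI /admin
            PassengerAppRoot /var/www/railsapp
        </Location>
        <Location /api>
            PassengerBaseURI /api
            PassengerAppRoot /var/www/railsapp
        </Location>
        <Directory /var/www/railsapp/public>
            Allow from all
            Options -MultiViews
        </Directory>

    The document root itself never changes; the sub-URIs are simply aliased into the Rails app's public directory.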

    Read the article

  • My website directories download instead of actually opening up in the browser

    - by numerical25
    I added some screencasts to show what I am having issues with: http://screencast.com/t/212t3ANINqk http://screencast.com/t/bR44U1wkvNZl http://screencast.com/t/iDS7APYYsa The browser downloads my subdirectories instead of opening them up and displaying the index file of each one. Here is the situation: I am trying to get my web server up using MacPorts and I am just trying to configure all the files. I am using PHP, Apache, etc. The localhost root loads fine, but anything beyond that cannot be found. Edit: I've tried adding the following to httpd.conf within the <IfModule mime_module> block, but no luck:

        AddType application/x-httpd-php .php
        AddType application/x-httpd-php .phtml
        AddType application/x-httpd-php .php3
        AddType application/x-httpd-php .php4
        AddType application/x-httpd-php .html
        AddType application/x-httpd-php-source .phps
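    When the browser downloads .php files (or directory URLs) instead of rendering them, Apache is usually serving them without any PHP handler loaded, so the AddType lines alone have no effect. A hedged sketch of what the httpd.conf typically still needs (the module path is an assumption; use whichever mod_php file MacPorts actually installed):

        # Load PHP itself -- without this the AddType lines do nothing
        LoadModule php5_module modules/mod_php5.so

        # Serve index.php (or index.html) when a directory is requested
        <IfModule dir_module>
            DirectoryIndex index.php index.html
        </IfModule>

    After editing, restart Apache (with MacPorts this is usually something like sudo /opt/local/apache2/bin/apachectl restart) and re-test with a plain phpinfo() page.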

    Read the article

  • Nginx + PHP-FPM, php scripts not running

    - by Gee
    I installed a LEMP stack on Linode using one of the stack scripts they offer. I actually didn't run it on boot but instead entered the commands manually since it seemed to not install everything correctly. Anyway, after installing everything and starting both the server and php-fpm without error, I created a phpinfo(); page on the default nginx location (/var/www/index.php). Problem is that it's not executing the script and instead displays as a static file. Anyone know how I could approach this?
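    In a stock nginx config the PHP location block is often commented out or missing, so .php files fall through to the static-file handler. A hedged sketch of the block that normally has to be enabled inside the server { } section (the socket/port is an assumption; it must match the php-fpm pool configuration):

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass 127.0.0.1:9000;    # or unix:/var/run/php-fpm.sock
            fastcgi_index index.php;
        }

    Reload nginx afterwards and confirm php-fpm is really listening on the address given to fastcgi_pass.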

    Read the article

  • Printer Brother DCP-110C Linux 64-bit drivers

    - by Ondra Žižka
    Hi, I need a 64-bit Linux driver for the DCP-110C (for Ubuntu 10.04 64-bit). I found only a 32-bit one here: http://welcome.solutions.brother.com/bsc/public_s/id/linux/en/index.html I've tried to follow those instructions. During the installation, I got this:

        ondra@ondra-doma:~/Downloads$ sudo dpkg -i --force-all dcp110clpr-1.0.2-1.i386.deb
        dpkg: warning: overriding problem because --force enabled:
         package architecture (i386) does not match system (amd64)
        (Reading database ... 257283 files and directories currently installed.)
        Preparing to replace dcp110clpr 1.0.2-1 (using dcp110clpr-1.0.2-1.i386.deb) ...
        Unpacking replacement dcp110clpr ...
        Setting up dcp110clpr (1.0.2-1) ...
        ln: creating symbolic link `/usr/lib/libbrcompij2.so.1.0': File exists
        ln: creating symbolic link `/usr/lib/libbrcompij2.so.1': File exists
        ln: creating symbolic link `/usr/lib/libbrcompij2.so': File exists

    After installation, the printer is listed on the CUPS server but does not work: no command has any effect on the printer, which is, of course, on and connected. Has anyone found a working solution? Thanks, Ondra

    Read the article

  • mount dev, proc, sys in a chroot environment?

    - by Patrick
    I'm trying to create a Linux image with custom-picked packages. I followed the guide here: http://www.olpcnews.com/forum/index.php?topic=4766.0 However, when I tried to install some packages, configuration failed because the proc, sys and dev directories were missing. So I learned from other places that I need to "mount" the host's proc (and related) directories into my chroot environment. However, I have seen two syntaxes and am not sure which one to use. On the host machine:

        mount --bind /proc <chroot dir>/proc

    and another syntax (inside the chroot environment):

        mount -t proc none /proc

    Which one should I use, and what is the difference? Edit: What I'm trying to do is hand-craft the packages I'm going to use on an XO laptop, because compiling packages takes a really long time on the real XO hardware. If I can build all the packages I need and just flash the image to the XO, I save time and space.
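    A hedged sketch of the usual pre-chroot setup, run on the host (the chroot path is a placeholder). Bind mounts expose the host's live /dev, /proc and /sys inside the chroot, whereas mount -t proc creates a fresh proc instance from inside; for installing packages into an image, the bind-mount form from the host is the more common pattern:

        CHROOT=/srv/chroot                    # placeholder path
        sudo mount --bind /dev     "$CHROOT/dev"
        sudo mount --bind /dev/pts "$CHROOT/dev/pts"
        sudo mount --bind /proc    "$CHROOT/proc"
        sudo mount --bind /sys     "$CHROOT/sys"
        sudo chroot "$CHROOT" /bin/bash
        # ... install packages inside the chroot ...
        # then unmount in reverse order:
        sudo umount "$CHROOT/sys" "$CHROOT/proc" "$CHROOT/dev/pts" "$CHROOT/dev"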

    Read the article

  • Web-based interface for OpenSSL client certificates

    - by Felix
    Hi there! We are currently developing an Apache 2-based web application and want to invite some beta testers to give it a try. To be on the safe side, access should be granted via individual browser certificates (.p12) issued by a (fake) CA. Our users will go through a complete register/login process and some of them will be granted administrative privileges within the application; that's why a simple preceding web-based authentication won't be sufficient. At the moment I am using a server-side shell script to generate the certificates each time. Do you know of a small, web-based tool to simplify the process of generating and revoking those certificates? Maybe an overview of the CA's index.txt plus the option to revoke a cert and a link to download them directly?

    Read the article

  • Visual Studio 2008 crashes whenever I try to add a file to TFS

    - by Herms
    Wondering if anyone else has seen this or knows of a way to fix it. I have Visual Studio 2008 Pro with Team Explorer 2008 installed. Starting a couple of weeks ago any time I try to add a file to TFS (using the "Add Items to Folder" button in the Source Control Explorer window) VS crashes. I briefly see an "unhandled exception" dialog appear, but VS quits right after the dialog opens. I was able to see the exception at one point, and it looked like an index out of bounds exception trying to access some UI component list at -1 (can't remember the specifics, and it closes before I can bring it up now). I've tried uninstalling and reinstalling VS a couple of times already, along with resetting all of my settings. Neither helped.

    Read the article

  • phpbb behind a reverse proxy

    - by asciitaxi
    Hi, I've got a Django app running on Apache behind an nginx reverse proxy. Nginx takes requests on port 80 and forwards them to Apache on 127.0.0.1:81. This works fine. Now I want to run phpBB on Apache under /forums. My problem is that when phpBB does a redirect, it seems to redirect to the internal Apache port rather than port 80. So, for instance, when I first go to http://my-dev-server/forums to configure phpBB, it immediately redirects to http://127.0.0.1:81/forums/install/index.php. Is there something I need to do in the nginx/Apache/phpBB config to get it to redirect to the external port? Thanks very much!
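    A hedged sketch of the nginx side that usually prevents this (the directives are standard nginx, but the location layout here is an assumption): pass the original Host header through to the backend and rewrite any Location headers Apache emits with its internal address:

        location / {
            proxy_pass http://127.0.0.1:81;
            proxy_set_header Host $host;                 # backend sees the public host
            proxy_set_header X-Real-IP $remote_addr;
            proxy_redirect http://127.0.0.1:81/ /;       # fix absolute redirects from Apache
        }

    On the Apache side, keeping ServerName set to the public hostname and UseCanonicalName Off also helps it build redirects from the Host header rather than from 127.0.0.1:81.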

    Read the article

  • Making python run on my webserver

    - by richzilla
    Hi all, I'm getting a bit stuck regarding the options for running Python scripts on my server. From the research I've done so far, I can see I need to modify Apache slightly to run Python scripts, using either mod_wsgi or mod_python. Two issues I have: mod_python doesn't appear to be maintained anymore (last release, 2007), and mod_wsgi appears to require modification of my httpd.conf file on a per-application basis. What I'm wanting to know is: is there a way of getting Python scripts to run in the same way as PHP, i.e. just by going to index.py etc., or is it more involved than that? At present I'm just trying to set it up on my XAMPP install. Any help would be appreciated.
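    A minimal mod_wsgi sketch (paths are placeholders, and it assumes mod_wsgi is already loaded): unlike PHP, the usual pattern is one WSGIScriptAlias per application rather than per-file execution, but a single entry point is enough for simple scripts:

        # httpd.conf (placeholder paths)
        WSGIScriptAlias /myapp /var/www/wsgi/app.wsgi

        # /var/www/wsgi/app.wsgi -- the smallest possible WSGI application
        def application(environ, start_response):
            body = b"Hello from Python"
            start_response("200 OK", [("Content-Type", "text/plain"),
                                      ("Content-Length", str(len(body)))])
            return [body]

    There is also a CGI-style route (AddHandler cgi-script .py with Options +ExecCGI), which behaves more like "just go to index.py" but is slower and generally discouraged for anything beyond experiments.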

    Read the article

  • changing user in ubuntu

    - by Rahul Mehta
    Hi, this is my ls -all output. The zfapi folder is owned by root; how can I change this to www-data? Also, please advise: what are the first and second "root" columns? Thanks

        drwxr-xr-x 4 www-data www-data  4096 2011-01-06 18:21 cdnapi
        -rw-r--r-- 1 www-data www-data   678 2010-08-30 12:02 config.js
        drwxr-xr-x 4 www-data www-data  4096 2010-11-23 15:55 css
        drwxr-xr-x 7 www-data www-data  4096 2010-11-17 13:12 images
        -rw-r--r-- 1 www-data www-data 25064 2010-12-17 18:26 index.html
        -rw-r--r-- 1 www-data www-data 19830 2010-12-18 11:24 init.js
        drwxr-xr-x 2 www-data www-data  4096 2010-12-02 12:34 lib
        -rw-r--r-- 1 www-data www-data 18758 2010-12-06 18:00 styles.css
        -rw-r--r-- 1 www-data www-data  1081 2010-10-21 17:56 testbganim.html
        drwxr-xr-x 2 www-data www-data  4096 2010-12-17 11:15 yapi
        drwxr-xr-x 7 root     root      4096 2011-01-07 18:20 zfapi
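    In ls -l output the third column is the owning user and the fourth is the owning group, so zfapi is owned by user root and group root. Changing it recursively to the web-server user is a single command, run from the directory shown above:

        sudo chown -R www-data:www-data zfapi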

    Read the article

  • How much HDD space would I need to cache the web while respecting robots.txt?

    - by Koning Baard XIV
    I want to experiment with creating a web crawler. I'll start by indexing a few medium-sized websites like Stack Overflow or Smashing Magazine. If it works, I'd like to start crawling the entire web. I'll respect robots.txt. I'll save all HTML, PDF, Word, Excel, PowerPoint, Keynote, etc. documents (not exes, dmgs, etc., just documents) in a MySQL DB. Next to that, I'll have a second table containing all results and descriptions, and a table with words and on which pages to find those words (i.e. an index). How much HDD space do you think I need to save all the pages? Is it as low as 1 TB, or is it about 10 TB, 20? Maybe 30? 1000? Thanks
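    A rough, order-of-magnitude sketch (the figures are assumptions, not measurements): if an average crawled document is around 100 KB of HTML or text, then

        1 TB  ≈ 10^12 / 10^5  = about 10 million documents
        30 TB ≈ 3 × 10^13 / 10^5 = about 300 million documents

    Public estimates of the indexable web run to many billions of pages, so "the entire web" lands in the petabyte range; a single-digit or low-double-digit terabyte budget only covers a sample on the order of tens to hundreds of millions of pages.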

    Read the article

  • I want to start my portfolio site using ASP.Net and I'm a bit lost about how to actually put it on t

    - by Papuccino1
    I found this site: www.discountasp.net They seem cheap enough and have a track record, so I decided to host my site with them. Here's where I'm confused: I host the application (my website) with them and they give me an IP address, right? Users can then visit my site by typing in that IP address, right? (Of course, once I move the index file and create a default web folder, etc.) The next step is buying a domain name, right? Like www.mysite.com? Is this the way it's done, or am I doing it wrong?

    Read the article

  • *.example.com wildcard domain can be parsed from a single page?

    - by Sean Kean
    For a domain 'example.com', what is the easiest way to set up wildcard DNS (*.example.com), hosting, htaccess/httpd.conf/virtualhost, and a script on a page so that: how.do.i.setup.a.site.with.wildcards.like.this.example.com or anything.that.is.given.as.a.subdomain.for.example.com is rendered by a page at example.com/index.html, yet keeps the wildcard subdomain in the URL bar and passes the full URL as a parameter for rendering tags in HTML? An example tag is a Facebook comment:

        <div class="fb-comments" data-href="http://how.do.i.setup.a.site.with.wildcards.like.this.example.com"
             data-num-posts="2" data-width="500"></div>

    I just opened a hosting account with spry.com and have a VPS running Ubuntu 11.04-x86-LAMP. Essentially, what is the most straightforward way of doing this? Thanks so much. (I originally posted this over on Stack Overflow but realize it's more of a Server Fault question.)
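    A hedged sketch of the web-server half (DNS still needs a wildcard A record for *.example.com pointing at the VPS, set at the DNS provider): a single vhost with a ServerAlias catches every subdomain, and the page reads the requested host out of the request to build its tags. The file names and paths are placeholders, and this swaps the static index.html for a small PHP page so the host can be echoed dynamically:

        # Apache vhost
        <VirtualHost *:80>
            ServerName  example.com
            ServerAlias *.example.com
            DocumentRoot /var/www/example
            DirectoryIndex index.php
        </VirtualHost>

        # /var/www/example/index.php -- echo the requested host back into the tag
        <?php
        $host = htmlspecialchars($_SERVER['HTTP_HOST'], ENT_QUOTES);
        echo '<div class="fb-comments" data-href="http://' . $host
           . '" data-num-posts="2" data-width="500"></div>';

    The subdomain stays in the URL bar because nothing redirects; every hostname is answered by the same document root.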

    Read the article

  • Odd Suhosin memory alerts

    - by slice
    I am getting a lot of odd Suhosin alerts in my syslog. The following are example entries:

        Jun  9 08:46:11 suhosin[9764]: ALERT - script tried to increase memory_limit to 2145386496 bytes which is above the allowed value (attacker '157.55.39.180', file '/var/www/site/index.php')
        Jun  9 08:46:11 suhosin[9744]: ALERT - script tried to increase memory_limit to 2145386496 bytes which is above the allowed value (attacker '109.74.2.136', file '/var/www/site/test.php')
        Jun  9 08:46:13 suhosin[9779]: ALERT - script tried to increase memory_limit to 0 bytes which is above the allowed value (attacker 'REMOTE_ADDR not set', file 'unknown')
        Jun  9 08:46:13 suhosin[9779]: ALERT - script tried to increase memory_limit to 2145386496 bytes which is above the allowed value (attacker 'REMOTE_ADDR not set', file 'unknown')

    What is happening here? Why 0 bytes, or 2145386496 bytes (2046 MiB, about 2 GB)? Why does it sometimes state the attacker and the requested script and sometimes 'REMOTE_ADDR not set' and file 'unknown'? How do I proceed to figure this out?

    Read the article

  • Using wget to recursively download whole FTP directories

    - by user9406
    I want to copy all of the files and folders from one host to another. The files on the old host sit at /var/www/html, I only have FTP access to that server, and I can't tar all the files. A regular FTP connection to the old host drops me into the /home/admin folder. I tried running the following command from my new server:

        wget -r ftp://username:[email protected]

    But all I get is a made-up index.html file. What is the right syntax for using wget recursively over FTP?
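    A hedged sketch (hostname and credentials below are placeholders): point wget at the wanted directory explicitly and use mirror-style options. In FTP URLs the path is normally interpreted relative to the login directory (/home/admin here), so the leading %2F marks it as absolute:

        wget -m -np -nH --cut-dirs=3 \
             "ftp://username:password@old-host/%2Fvar/www/html/"

    -m mirrors recursively with timestamping, -np keeps wget from climbing into parent directories, and -nH with --cut-dirs=3 drop the hostname and the var/www/html prefix from the local directory layout.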

    Read the article

  • Clean URLS with mod rewrite and URL Encoded characters causes 404?

    - by Richard JP Le Guen
    I have a web site using mod_rewrite to get clean URLs and custom 404 pages. My .htaccess file looks like this:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^(.*)$ index.php?clean_url=$1 [QSA,L]
        </IfModule>

    What puzzles me is that if the URL contains a %2F (URL-encoded /), the server seems to force a 404. As an example, http://example.com/category/article would be a normal article, but http://example.com/category%2farticle gives a server-generated 404 page (not the custom 404 page). I wouldn't have expected this... why is this happening? Is there a way around it?
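    Apache rejects encoded slashes in the path before mod_rewrite ever runs, which is why the custom 404 never appears. A hedged sketch of the usual workaround, which has to go in the server or vhost config (it is not honoured in .htaccess):

        # httpd.conf / vhost
        AllowEncodedSlashes On     # or NoDecode on Apache 2.2.18+ to keep %2F literal

    With this enabled, %2F reaches the rewrite rules; whether the captured value arrives decoded depends on the On versus NoDecode variant.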

    Read the article

  • Nvidia 9 series or Intel HD 2000? [closed]

    - by EApubs
    I just tested an Nvidia 9300 GS card against an Intel Core i3's HD 2000 integrated graphics. Here are the Windows Experience Index scores I got:

        Nvidia 9300 GS:  Base Score 3.9 | Processor 7.1 | Memory 7.5 | Graphics 3.9 | Gaming Graphics 5.1 | Hard Disk 5.9
        Intel HD 2000:   Base Score 5.2 | Processor 7.1 | Memory 5.9 | Graphics 5.2 | Gaming Graphics 5.8 | Hard Disk 5.9

    My questions are: when using Intel HD graphics, why does it reduce the score of my RAM? How is that possible? It checks the speed of the RAM, not the size (I think). Intel graphics take some of the RAM space, but how can that affect the speed? And of the two, which is the better choice?

    Read the article

  • XAMPP: Access Forbidden!

    - by Yar
    I just installed a fresh XAMPP on OS X. Apache runs and I can see the splash page. I open httpd.conf and change both places that point to htdocs to somewhere else, which results in Apache showing an "Access Forbidden!" message. I plugged my directory in here:

        <Directory "/Applications/XAMPP/xamppfiles/htdocs">

    and here:

        DocumentRoot "/Applications/XAMPP/xamppfiles/htdocs"

    Most files have permissions like -rw-r--r--, but even if I set the index.php to 777 with chmod, nothing changes. Strangely, I just did this whole thing with MAMP and had no problems serving that directory, but it was slow.
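    An "Access Forbidden!" after changing DocumentRoot usually means either the new <Directory> block doesn't grant access or Apache can't traverse the path. A hedged sketch of a matching block for the new location (the path is a placeholder; the Order/Allow form matches the Apache 2.2 that XAMPP shipped at the time):

        DocumentRoot "/Users/yourname/Sites/myproject"
        <Directory "/Users/yourname/Sites/myproject">
            Options Indexes FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    Every parent directory on the path also needs the execute (search) bit for the Apache user (e.g. chmod o+x on each component); 777 on index.php alone does not help if Apache cannot enter the folders above it.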

    Read the article

  • How to block a user in Apache httpd from accessing a *.php file inside a directory, so the user must access it via the directory name instead

    - by Oxi
    My requirement looks simple, but Googling has not helped me yet. My query is: I want to show a 404 page to a user (not redirect to another folder or file) who tries to access *.php files on my website directly. For example, when a client asks for www.example.com/home/ I want to show the content, but when the user requests www.example.com/home/index.php I want to show a 404 page. I tried different methods and nothing worked for me; one of them is shown below:

        <Directory "C:/xampp/htdocs/*">
            <FilesMatch "^\.php">
                Order Deny,Allow
                Deny from all
                ErrorDocument 403 /test/404/
                ErrorDocument 404 /test/404/
            </FilesMatch>
        </Directory>

    Thanks in advance
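    A hedged sketch of a mod_rewrite approach (assuming mod_rewrite is enabled; the pattern is generic and not tied to the xampp path above): matching on %{THE_REQUEST}, which holds the client's original request line and is not changed by internal subrequests, lets /home/ still be served through DirectoryIndex index.php while a direct request for /home/index.php gets a 404:

        RewriteEngine On
        # Only match what the client literally asked for, so the DirectoryIndex
        # subrequest for index.php is unaffected
        RewriteCond %{THE_REQUEST} \s/[^\s?]*\.php[\s?] [NC]
        RewriteRule \.php$ - [R=404,L]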

    Read the article

  • mod_rewrite RewriteRule is not working

    - by buggy1985
    Hi, this is a follow-up to this question: Rewrite URL - how to get the hostname and the path? And a copy of this: mod_rewrite RewriteRule is not working. I've got this rewrite rule:

        RewriteEngine On
        RewriteRule ^(http://[-A-Za-z0-9+&@#/%=~_|!:,.;]*)/([-A-Za-z0-9+&@#/%=~_|!:,.;]*)\?([A-Za-z0-9+&@#/%=~_|!:,.;]*)$ http://http://www.xmldomain.com/bla/$2?$3&rtype=xslt&xsl=$1/$2.xsl

    It seems to be correct, and exactly what I need, but it doesn't work on my server; I get a 404 page not found error. mod_rewrite is enabled, as the following simple rule works fine:

        RewriteEngine On
        RewriteRule ^page/([^/\.]+)/?$ index.php?page=$1 [L]

    Can you help? Thanks
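    A hedged note on why the first rule can never match: a RewriteRule pattern is only tested against the URL path, without scheme, host, or query string, so nothing ever starts with "http://" and the \?... part never sees the query. The host and query string have to come from RewriteCond variables instead; a generic sketch of that shape (the target URL is taken from the question as-is and is not verified):

        RewriteEngine On
        # combine host and query string so both can be captured in one condition
        RewriteCond %{HTTP_HOST}::%{QUERY_STRING} ^([^:]+)::(.*)$
        RewriteRule ^/?([^/]+)$ http://www.xmldomain.com/bla/$1?%2&rtype=xslt&xsl=http://%1/$1.xsl [R,L]

    Here %1 and %2 are the host and query-string captures from the RewriteCond line, and $1 is the path capture from the RewriteRule.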

    Read the article

  • How do I remove a URL from Google without having to have a Google E-mail Account

    - by PP
    Really simple question: I do not want a Google account. I just want Google to stop making requests every 2 minutes for a URL it should never have known about (apparently Google harvests URLs from search requests as well as private e-mails, not just from actual web pages). But when I search Google's help for removing URLs, it appears I have to use their "webmaster tools", which require logging into a Gmail account! How do I tell Google not to index my URL without becoming a customer? Note: I already return 404 for the URLs in question using a rewrite rule; this appears to make zero difference to the crawler, which continually attempts to fetch the page every 2 minutes.
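    A hedged sketch of the account-free option: the crawler checks /robots.txt on the site itself, so a disallow rule there stops Googlebot from requesting the path at all (the path below is a placeholder; replace it with the URL being fetched):

        # /robots.txt at the site root
        User-agent: Googlebot
        Disallow: /private-page

    Note that robots.txt stops crawling rather than removing anything already indexed; continuing to serve 404/410 while the path is blocked will eventually age the URL out of the index as well.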

    Read the article

  • Switch 302 redirect to 301 with Apache 2 ProxyPass in front of Tomcat 6

    - by Gearóid
    I'm trying to optimise my site for SEO and it seems as though there is a 302 redirect in action for the HTTP requests. I'm hosting my app on a Tomcat 6 server which sits behind an Apache 2 server. I use the ProxyPass method (http://tomcat.apache.org/tomcat-6.0-doc/proxy-howto.html) to forward all requests to port 8080 (the port my app is hosted on). I've seen a lot of advice on how to set the redirect type when using the VirtualHost method, but none to do with ProxyPass. The app is a Struts app that forwards users on to index.jsp when they hit the base URL. Could this also be the issue? I'm grateful for any help on this one! Cheers!

    Read the article

  • How can I remove OLD history from Google Chrome?

    - by Norman Ramsey
    I'm working on a laptop with a modest hard drive, and 500MB is taken up with Google Chrome "History Index" and "Thumbnails" files. Some of these files are a year old. Chrome offers me the option to remove recent history, but I want the opposite: I want to remove old history. (Ideally I would remove the least recently used history information, but I don't expect to be able to do that.) Anyone have any ideas? I'm running the standard Debian google-chrome-beta package.

    Read the article

  • how to set auto redirection in tomcat

    - by Registered User
    I have a site, http://social.openitup.in; right now what you are seeing is a default Tomcat 6 page. I am using mod_ajp as a front end, and the Apache vhost configuration for it is:

        <VirtualHost *:80>
            ServerName social.openitup.in
            ServerAdmin webmaster@localhost
            ProxyRequests off
            <Proxy *>
                Order deny,allow
                Allow from all
            </Proxy>
            ProxyPreserveHost On
            ProxyPass / ajp://192.168.1.19:8009/
            ProxyPassReverse / ajp://192.168.1.19:8009/
        </VirtualHost>

    However, I have an application running on it at http://social.openitup.in/olat. What I want is that when someone opens http://social.openitup.in, rather than seeing the Tomcat 6 home page from /var/lib/tomcat6/webapps/ROOT/index.html, the person is redirected to the OLAT application, which is in /var/lib/tomcat6/webapps/olat. How can this be achieved? The above vhost configuration is on a machine separate from the one where OLAT is running.
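    A hedged sketch of handling this on the proxy host itself: issue an external redirect for the bare root before the request is proxied, so only "/" is affected and every other path still goes to Tomcat. Using mod_rewrite (evaluated early in the vhost) keeps it independent of the ProxyPass lines:

        # inside the <VirtualHost *:80> block shown above
        RewriteEngine On
        RewriteRule ^/$ /olat/ [R=301,L]

    An alternative is a tiny redirecting index.jsp in Tomcat's ROOT webapp on the OLAT machine, but the rewrite rule keeps the change on the front-end host.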

    Read the article
