Search Results

Search found 17124 results on 685 pages for 'final cut pro'.


  • Apache cannot find mysql database modules

    - by user809857
    I've created a simple Django project and set up a MySQL database. The project just creates an entry in the database. It works fine when I use Django's built-in development server (runserver). But when I deployed the project on Apache with mod_wsgi (Ubuntu server), Django could not find 'books', which in this case is my table in the database. The MySQL database I use with runserver and with Apache is exactly the same. I also rebuilt the database using Django's sqlall, validate and syncdb commands, but I still get the error. What could be wrong with what I'm doing? Thanks
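
    A minimal mod_wsgi virtual host sketch for this kind of setup is shown below; the paths, process name and settings module are assumptions for illustration, not taken from the question. A common cause of this symptom is the Apache process loading a different settings module (and therefore a different database) than runserver does, so pinning the project path and settings explicitly is worth checking.

      # Hypothetical example paths/names -- adjust to the real project layout.
      <VirtualHost *:80>
          ServerName example.com

          # Run the Django app in its own daemon process, with the project on the Python path.
          WSGIDaemonProcess mysite python-path=/srv/mysite
          WSGIProcessGroup mysite

          # Hand all requests to the project's WSGI entry point.
          # wsgi.py should set DJANGO_SETTINGS_MODULE to the same settings file runserver uses,
          # and the MySQLdb module must be importable by Apache's Python, not just your shell's.
          WSGIScriptAlias / /srv/mysite/mysite/wsgi.py

          <Directory /srv/mysite/mysite>
              <Files wsgi.py>
                  Order allow,deny
                  Allow from all
              </Files>
          </Directory>
      </VirtualHost>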

    Read the article

  • Will my current page layout get me penalized for duplicate content?

    - by Perry Roper
    I am using WordPress, and in my post sidebar I have related posts which may be of interest to the user; however, I also show an excerpt of each article, which is normally the first paragraph of the post it links to. For example: http://musicdune.com/reviews/album-review-ellie-goulding-lights If you do a Google search for the first excerpt in the related posts section from that page, you get 4-5 results from my domain: http://www.google.co.uk/search?sourceid=chrome&ie=UTF-8&q=Strip+back+the+synths,+fast+beats+and+the+other+pop+elements,+and+you%E2%80%99re+left+with+something+elegant+and+soulful Is it recommended that I remove the excerpt from the related posts?
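
    If the excerpts do turn out to be the problem, one option is to render the related-posts list with titles only. A minimal sketch, assuming the theme builds the list from a WP_Query held in a hypothetical $related variable:

      <?php
      // Minimal sketch: list related posts by title only, so no body text is duplicated.
      // $related is assumed to be a WP_Query of related posts (hypothetical variable name).
      if ( $related->have_posts() ) {
          echo '<ul class="related-posts">';
          while ( $related->have_posts() ) {
              $related->the_post();
              printf(
                  '<li><a href="%s">%s</a></li>',
                  esc_url( get_permalink() ),
                  esc_html( get_the_title() )
              );
          }
          echo '</ul>';
          wp_reset_postdata(); // restore the main loop's global post data
      }
      ?>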

    Read the article

  • How do I point a new domain to start on a page that's not index.html on separate hosting?

    - by Owen Campbell-Moore
    I'm using a service (CMS/host) called Squarespace to host my site, and today I'm registering the domain for it. Basically, how do I make it so that when somebody types www.tedxoxford.com it points at http://www.tedxoxford.com/landing (currently http://tedxoxford.squarespace.com/landing) instead of the default index? Is this possible? Squarespace is quite a restricted CMS, which means that the logos etc. all point to the index, so I don't want people ending up on my landing/splash page every time they want the home page, only the first time they type in the URL. A dirty hack would be to check the referrer and redirect anyone hitting the index to the landing page, but that's a lot of loading overhead I'd rather avoid...

    Read the article

  • Facebook - Filter Page posts by #hashtag [on hold]

    - by beppe9000
    I'm trying to gather all the posts (official posts and visitors' posts) on my page which contain a specified #tag, to later show them on a website. But I have no clue how to accomplish this. Is there any API capable of this, or anything else that could help? I basically need to get the IDs of all those posts by looping through them and checking for my hashtag. I'm planning to do this server-side, so PHP is my choice.
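
    A possible server-side approach, sketched below, is to page through the Graph API's /{page-id}/feed edge and filter the messages for the hashtag yourself. The page ID, access token and hashtag are placeholders, and the sketch assumes a valid page access token with permission to read the feed and a PHP setup where file_get_contents can fetch HTTPS URLs.

      <?php
      // Sketch: collect IDs of page posts whose message contains a given hashtag.
      // PAGE_ID, ACCESS_TOKEN and the hashtag are hypothetical placeholders.
      $hashtag  = '#mytag';
      $url      = 'https://graph.facebook.com/PAGE_ID/feed?fields=id,message&limit=100&access_token=ACCESS_TOKEN';
      $matching = array();

      while ($url) {
          $response = json_decode(file_get_contents($url), true);
          if (empty($response['data'])) {
              break;
          }
          foreach ($response['data'] as $post) {
              if (isset($post['message']) && stripos($post['message'], $hashtag) !== false) {
                  $matching[] = $post['id'];
              }
          }
          // Follow Graph API paging until there are no more pages.
          $url = isset($response['paging']['next']) ? $response['paging']['next'] : null;
      }

      print_r($matching);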

    Read the article

  • CSS variable height columns [migrated]

    - by Rob
    I have created a website, www.unionfamilies.com. I have a header, a main section consisting of two columns, and a footer. Currently, I have specified heights for header, main, footer, etc. I would like to make the site so the header stays on top, the columns adjust to match the height of the longest column, and the footer stays at the bottom. Can someone help me with this? I am new to CSS, so please be patient. Thank you.

    Read the article

  • Is it fine to put different category of stuff in a single domain name?

    - by Fahad Uddin
    I own a website about startups and finance. I am planning to get into WordPress programming and sell WordPress themes. I thought of buying a separate domain name for the WordPress site, but it takes quite a lot of time to set up a website and then do its SEO. Is it fine (in terms of SEO and professionalism) to put the WordPress category inside my old domain, like this? Domain: www.startupsandfinance.com WordPress subdomain: www.wp.startupsandfinance.com

    Read the article

  • Making profit - Adsense contains too many stopwords

    - by Jack
    I was thinking of using Adsense, but after I've read about the stopwords policy... Too many words are banned: "a**, s**t, id**t, a****le, bu****it," etc.. That generally means that I cannot use Adsense, unless I edit my posts. How else would I go about making some profit out of my site? I don't want to use things like popups, text-link ads, I can't post many shoplinks, and my site is too small to sell adspace. For specific reasons, I also don't do videos, am not planning on starting a forum or premium content, or anything very close to what's in this sentence. The reason for this post is basically the fact that I've seen sites without any ads, huge sites, and I started to wonder: how do they make money? That was Gizmodo to be precise. Some info about my site: It's a blog where I review games and post news. There is no forum, no registration.

    Read the article

  • Problem with sitemaps in Google Webmaster Tools

    - by Alex
    I have a WP MU 3.6.1 install with domain mapping (0.5.4.3), W3TC (0.9.3) and Google XML Sitemaps (4.0 BETA). I have 4 different sitemaps: sub-1.com/sitemap.xml sub-2.com/sitemap.xml sub-3.com/sitemap.xml sub-4.com/sitemap.xml In Google Webmaster Tools I got 59 errors & 14 warnings. Sitemap errors: We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit. General HTTP error: 404 not found. Sitemap: sub-2.com/sitemap-pt-post-2011-02.xml etc. (but when I click on my sitemap links they work fine) Sitemap warnings: URLs not accessible. When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted. Sitemap: sub-2.com/sitemap-misc.xml HTTP Error: 404 URL: /sitemap.html (but when I click on my sitemap links they work fine) Sitemap index errors: URLs not accessible. When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted. HTTP Error: 404 URL: /sitemap-pt-post-2010-09.xml (but when I click on my sitemap links they work fine) Web pages: 3,276 submitted, 3,247 indexed. What do I have to put in Network Admin > Performance (W3TC) > Page Cache > Cache Preload > Sitemap URL? I have added "/sitemap.xml". My robots.txt: http://pastebin.com/3K2U0mQa My .htaccess: http://pastebin.com/efJJ6zwy How can I make it work right?

    Read the article

  • Ruby 1.9.3 segmentation fault on rackspace

    - by user531065
    I'm having a similar problem as "ruby 1.9 segmentation fault on exit." I followed the solution and updated libselinux with yum and have the following version installed: Package libselinux-2.0.96-6.fc14.1.x86_64 already installed and latest version I am running a RackSpace machine, kernel version: 2.6.35.4-rscloud. My C backtrace is: -- C level backtrace information ------------------------------------------- /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x18910a) [0x7f3cdb6c010a] vm_dump.c:796 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x5eb04) [0x7f3cdb595b04] error.c:258 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_bug+0xb8) [0x7f3cdb5969d8] error.c:277 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x119865) [0x7f3cdb650865] signal.c:609 /lib64/libpthread.so.0() [0x328540eeb0] /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x179272) [0x7f3cdb6b0272] ./include/ruby/ruby.h:1306 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1800d5) [0x7f3cdb6b70d5] vm.c:624 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_yield+0x44) [0x7f3cdb6bb274] vm.c:654 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0xab821) [0x7f3cdb5e2821] numeric.c:3204 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1800d5) [0x7f3cdb6b70d5] vm.c:624 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_yield+0x44) [0x7f3cdb6bb274] vm.c:654 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_ary_each+0x54) [0x7f3cdb5622e4] array.c:1478 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_iseq_eval+0x1f0) [0x7f3cdb6bbae0] vm.c:1447 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x67bfd) [0x7f3cdb59ebfd] load.c:310 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_require_safe+0x6ef) [0x7f3cdb5a02df] load.c:619 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1800d5) [0x7f3cdb6b70d5] vm.c:624 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_yield+0x44) [0x7f3cdb6bb274] vm.c:654 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_ary_each+0x54) [0x7f3cdb5622e4] array.c:1478 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_iseq_eval_main+0xaf) [0x7f3cdb6bbbdf] vm.c:1461 
/usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x6471a) [0x7f3cdb59b71a] eval.c:204 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(ruby_exec_node+0x1d) [0x7f3cdb59c56d] eval.c:251 /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(ruby_run_node+0x1e) [0x7f3cdb59e60e] eval.c:244 ruby() [0x4008eb] /lib64/libc.so.6(__libc_start_main+0xfd) [0x3284c1ee5d] ruby() [0x4007d9]

    Read the article

  • Writing files in a sub folder of the web folder (apache security)

    - by Homunculus Reticulli
    I need to save session data for a dynamic web page script by writing to file. I have two questions: Are there any security preferences as to whether to save the data UNDER the web folder, or OUTSIDE the web folder? I attempted to write to the folder and (unsurprisingly) I got a 'file permission refused' type error. Should I set the folder ownership to the apache user (600, 640 or 644?) [[Edit]] core <- 'OUTSIDE' web folder (PHP scripts live here) data <- 'OUTSIDE' web folder (session data and other misc data resides here) web <- web root folder js <- any folder below is 'INSIDE' the web folder css html For example, in a PHP script (i.e. a dynamic PHP page), I can attempt to write to a file using something like fput('../data',data), yet (as I understand it) ../data should not be accessible - for security reasons. Could someone please provide a simple example that shows how to provide access to ../data/ in the example given above? What are the actual SPECIFIC steps required? BTW, I am running on a LAMP stack.
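
    A minimal sketch of one common arrangement, using the directory names from the question: keep data outside the web root, make the directory writable by the Apache user (directory ownership plus a directory mode such as 750, rather than a file mode like 644), and write to it by path from the script. The ownership commands in the comments are illustrative and assume the Apache user is www-data and a layout under /path/to/site.

      <?php
      // Sketch: write session data to a directory OUTSIDE the web root.
      // Assumed layout (from the question): /path/to/site/web is the document root,
      // /path/to/site/data sits beside it.  One-time setup (as root, assuming the
      // Apache user is www-data):
      //   chown www-data:www-data /path/to/site/data
      //   chmod 750 /path/to/site/data

      session_start();

      $dataDir = __DIR__ . '/../data';          // resolves to a path outside the document root
      $file    = $dataDir . '/session_' . session_id() . '.txt';

      // file_put_contents returns false on failure, e.g. if the permissions are wrong.
      if (file_put_contents($file, serialize($_SESSION)) === false) {
          error_log('Could not write session data to ' . $file);
      }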

    Read the article

  • Why deny access to website for msnbot/bingbot?

    - by Quandary
    I've seen quite a lot of tutorials that recommend banning user agents containing the strings libwww-perl and msnbot. I understand why one would ban libwww-perl; it's mainly, if not only, used for hacking and spamming. But why are there so many sites recommending banning msnbot/bingbot? Since it's a search engine, even if only one with a marginal market share, I would expect one would want this bot to crawl one's sites. What is it that msnbot does that makes people ban it?

    Read the article

  • Url rewrite subfolder to root and forbid accessing subfolder

    - by Alessandro Pezzato
    I have drupal installed in a subfolder drupal, but I want to access pages as it is in root folder: http://www.example.com instead of http://www.example.com/drupal I'm able to have this working, but it's also working with url containing subfolder, so I have http://www.example.com and a clone site in http://www.example.com/drupal What is the rule to forbid access to subfolder? I want all url starting with http://www.example.com/drupal being forbidden. This is .htaccess in / directory: Options -Indexes Options +FollowSymLinks <IfModule mod_rewrite.c> RewriteEngine on RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC] RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301] RewriteRule ^(.*+)$ drupal/$1 [L,QSA] </IfModule> And this is drupal .htaccess in /drupal/ directory: Options -Indexes Options +FollowSymLinks ErrorDocument 404 index.php DirectoryIndex index.php index.html index.htm # Override PHP settings that cannot be changed at runtime. See # sites/default/default.settings.php and drupal_initialize_variables() in # includes/bootstrap.inc for settings that can be changed at runtime. # PHP 5, Apache 1 and 2. <IfModule mod_php5.c> php_flag magic_quotes_gpc off php_flag magic_quotes_sybase off php_flag register_globals off php_flag session.auto_start off php_value mbstring.http_input pass php_value mbstring.http_output pass php_flag mbstring.encoding_translation off </IfModule> # Requires mod_expires to be enabled. <IfModule mod_expires.c> # Enable expirations. ExpiresActive On # Cache all files for 2 weeks after access (A). ExpiresDefault A1209600 <FilesMatch \.php$> # Do not allow PHP scripts to be cached unless they explicitly send cache # headers themselves. Otherwise all scripts would have to overwrite the # headers set by mod_expires if they want another caching behavior. This may # fail if an error occurs early in the bootstrap process, and it may cause # problems if a non-Drupal PHP file is installed in a subdirectory. ExpiresActive Off </FilesMatch> </IfModule> # Various rewrite rules. <IfModule mod_rewrite.c> RewriteEngine on # Block access to "hidden" directories whose names begin with a period. This # includes directories used by version control systems such as Subversion or # Git to store control files. Files whose names begin with a period, as well # as the control files used by CVS, are protected by the FilesMatch directive # above. RewriteRule "(^|/)\." - [F] # To redirect all users to access the site WITH the 'www.' prefix, # (http://example.com/... will be redirected to http://www.example.com/...) # uncomment the following: # RewriteCond %{HTTP_HOST} !^www\. [NC] # RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301] # # To redirect all users to access the site WITHOUT the 'www.' prefix, # (http://www.example.com/... will be redirected to http://example.com/...) # uncomment the following: RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC] RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301] RewriteBase /drupal # Pass all requests not referring directly to files in the filesystem to # index.php. Clean URLs are handled in drupal_environment_initialize(). RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteCond %{REQUEST_URI} !=/favicon.ico #RewriteRule ^ index.php [L] RewriteRule ^(.*)$ index.php?q=$1 [L,QSA] # Rules to correctly serve gzip compressed CSS and JS files. # Requires both mod_rewrite and mod_headers to be enabled. <IfModule mod_headers.c> # Serve gzip compressed CSS files if they exist and the client accepts gzip. 
RewriteCond %{HTTP:Accept-encoding} gzip RewriteCond %{REQUEST_FILENAME}\.gz -s RewriteRule ^(.*)\.css $1\.css\.gz [QSA] # Serve gzip compressed JS files if they exist and the client accepts gzip. RewriteCond %{HTTP:Accept-encoding} gzip RewriteCond %{REQUEST_FILENAME}\.gz -s RewriteRule ^(.*)\.js $1\.js\.gz [QSA] # Serve correct content types, and prevent mod_deflate double gzip. RewriteRule \.css\.gz$ - [T=text/css,E=no-gzip:1] RewriteRule \.js\.gz$ - [T=text/javascript,E=no-gzip:1] <FilesMatch "(\.js\.gz|\.css\.gz)$"> # Serve correct encoding type. Header append Content-Encoding gzip # Force proxies to cache gzipped & non-gzipped css/js files separately. Header append Vary Accept-Encoding </FilesMatch> </IfModule> </IfModule>
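
    One way to forbid the /drupal URLs directly, sketched below, is to match the originally requested URL in %{THE_REQUEST} from the root .htaccess, before the rule that rewrites into drupal/. Because THE_REQUEST keeps the client's original request line, this blocks external requests that name the subfolder without affecting the internal rewrite. This is an illustrative rule, not tested against this exact setup.

      # In the root .htaccess, before "RewriteRule ^(.*+)$ drupal/$1":
      # refuse any request whose original URL explicitly contains the /drupal prefix.
      RewriteCond %{THE_REQUEST} \s/+drupal[/?\s] [NC]
      RewriteRule ^ - [F]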

    Read the article

  • How to force browsers to always reload xslt files?

    - by bitmask
    Related: Apache: How can I force the browser to reload CSS files? I'm building an XML page (on Apache 2) that is supposed to be translated to XHTML by the browser, so my server also serves a main.xslt which the XML file uses as its stylesheet, similar to the scenario with the CSS files in the linked question. However, none of the tricks provided in that answer, nor in some related questions on SO, solve the issue for Opera. While Firefox responds to F5 by fetching not only the XML file but also the XSLT file, Opera only reloads the XML file. I tried both setting the Last-Modified HTTP header via an .htaccess file and using Apache's expires module. This is what my .htaccess looks like right now: AddType text/xsl;charset=utf-8 .xslt ExpiresByType text/xsl "modification plus 1 second" Header set Last-Modified "Wed, 08 Jan 2000 23:11:55 GMT" #Header set Last-Modified "Wed, 08 Jan 2020 23:11:55 GMT" If I open the XSLT myself and manually reload it, the XML presentation is updated as well, but this is tedious for development. Note: there is no PHP or any kind of scripting involved. Everything is static.
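
    One thing worth trying during development, sketched below, is to tell every client not to cache the stylesheet at all via Cache-Control. It needs mod_headers enabled and is an illustrative snippet rather than a tested fix for Opera specifically.

      # During development: forbid caching of the XSLT stylesheet entirely.
      # Requires mod_headers.
      <FilesMatch "\.xslt?$">
          Header set Cache-Control "no-cache, no-store, must-revalidate"
          Header set Pragma "no-cache"
          Header set Expires "0"
      </FilesMatch>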

    Read the article

  • Referral traffic not appearing properly in Google Analytics

    - by Crashalot
    We have a partnership arrangement with another site where we pay them for users sent to us. However, they claim our referral numbers for them are lower than theirs by 50%. They are tracking clicks in Google Analytics (using events) while we are using visits in Google Analytics. Are we doing something wrong with our Google Analytics installation? <!-- Google Analytics BEGIN --> <script type="text/javascript"> var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-12345678-1']); _gaq.push(['_setDomainName', 'example.com']); _gaq.push(['_trackPageview']); (function() { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); })(); </script> <!-- Google Analytics END -->

    Read the article

  • apache domain redirect to subfolder

    - by Dennis
    I have a hosting account with GoDaddy. It's a Linux system running Apache. The way they do their setup, your primary domain is the root folder, and when you add a subdomain it lives in a subfolder of the root, which sucks. I want to set up a subfolder structure to organize my domains. I called GoDaddy support and they said to use redirects, but they did not know how to do that. How it's set up now: primary domain: www.domain.com -> / sub.domain.com -> /sub I want to create a directory structure and then redirect to each, but only show www.domain.com in the URL: www.domain.com -> /domain/www sub.domain.com -> /domain/sub I tried using: RewriteEngine On RewriteCond %{HTTP_HOST} ^(www.)?domain.com$ RewriteRule ^(/)?$ domain/www [L] but it just changes the URL to www.domain.com/domain/www. Can this be done in .htaccess?
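
    An internal rewrite (rather than an external redirect) along these lines should keep /domain/www out of the address bar. This is a sketch using the placeholder names from the question and assumes the rules live in the root .htaccess; note the trailing slash on the target, since rewriting to a directory without one can trigger mod_dir's external redirect, which is one common reason the rewritten path shows up in the URL.

      # Root .htaccess sketch: serve each host from its own subfolder
      # without exposing the folder in the URL.
      RewriteEngine On

      # www.domain.com -> /domain/www (internal rewrite, URL unchanged)
      RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
      RewriteCond %{REQUEST_URI} !^/domain/www/
      RewriteRule ^(.*)$ /domain/www/$1 [L]

      # sub.domain.com -> /domain/sub
      RewriteCond %{HTTP_HOST} ^sub\.domain\.com$ [NC]
      RewriteCond %{REQUEST_URI} !^/domain/sub/
      RewriteRule ^(.*)$ /domain/sub/$1 [L]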

    Read the article

  • cPanel's Web Disk - Security issues?

    - by Tim Sparks
    I'm thinking of using Web Disk (built into the later versions of cPanel) to allow a Windows or Mac computer to map a network drive that is actually a folder on our website (above the public_html folder). We currently use an antiquated local server to store information, but it is only accessible from one location - we would like to be able to access it from other locations as well. I understand that folders above public_html are not accessible via HTTP, but I want to know how secure access to these folders is when they are mapped as a network drive. There is potentially sensitive information involved, and we need to decide whether it is appropriate to store it here. The mapped network drive option seems to work well, as it behaves as if the files are on your own computer (i.e. you can open and save files without having to upload them afterwards - it happens automatically). We have used Dropbox for similar purposes, but space is an issue with them, as is accountability, so we haven't used it for sensitive information. Are there any notable security concerns with using Web Disk as a secure file server?

    Read the article

  • Huge google impression drop after cleaning html

    - by olgatorresfoundation
    Good morning, I am the webmaster of a non-profit organization that awards grants to colorectal cancer research projects and funds various colorectal cancer information campaigns. We have three domains: www.fundacioolgatorres dot org (Catalan) www.fundacionolgatorres dot org (Spanish) www.olgatorresfoundation dot org (English) So what happened? I redesigned olgatorresfoundation on the 20th and fundacionolgatorres on the 30th of May. In both cases, exactly two days later, the number of impressions dropped to almost nothing. Granted, we did not have the traffic of Microsoft, but a 90% decrease is a disaster of incredible proportions for us. My only real changes were cleaning up the old ineffective HTML to a cleaner form (mostly moving away from redundant table construction to a table-less view). Here is a before and after snapshot of what the change looks like: Before: http://www.fundacioolgatorres.org/aparell_digestiu/introduccio/ (unchanged page in Catalan) After: http://www.olgatorresfoundation.org/digestive_system/introduction/ (changed page in English) Does anybody have a clue as to what just happened? Why should a normal, sane HTML improvement be punished, and so dramatically? No URLs have been changed, neither have page names or descriptions. Possible secondary question: if Google sees it as a major overhaul and decides to drop the pagerank sharply, does it come back to pre-change levels if the content "checks out", or will the page start over from scratch earning those pagerank points (which would mean that we would have to wait 6 months for the pages to recover to the level they had two weeks ago)? (duplicated from productforums.google dot com/forum/#!category-topic/webmasters/crawling-indexing--ranking/YsnyX0JzOpY, hoping to reach a wider audience)

    Read the article

  • Sending Emails via Google SMTP - after some time quit working

    - by Chris
    On a website I use PHPMailer to send automated registration emails, etc., and also a newsletter tool (which loops through the emails and sends them one by one). Also, I configured and confirmed the @mydomain addresses in Gmail under Settings, so I can send from @mydomain addresses without the Gmail address being displayed. Furthermore I authorized the website to send mails with this link: https://accounts.google.com/DisplayUnlockCaptcha Now, after 2 months where everything worked perfectly fine, users suddenly stopped receiving emails, and most recently emails are not even being sent anymore. Also, I received many error messages like this: Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the recipient domain. We recommend contacting the other email provider for further information about the cause of this error. The error that the other server returned was: 550 550 5.4.1 [email protected]: Recipient address rejected: Access Denied (state 13). When I check at this link: https://toolbox.googleapps.com/apps/checkmx/ it reports 2 non-critical issues: Relayhost configuration detected. There SHOULD be a valid SPF record. So, the questions I would have are: does anybody have any hint why it stopped working, and what do the error messages mean? What can I do to fix it? Where do I set an SPF record (cPanel?)? What is a relayhost and how do I fix that? It is about 1000-1400 mails a day (Gmail's limit is 2000). Also, what can I do wrong when setting up an SPF record? I've heard there are some testing tools for that. Thank you so much already in advance for your help!
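
    For the SPF part specifically, the record is a DNS TXT entry on the domain; in cPanel it is usually added under the DNS Zone Editor or an email authentication section, depending on the host. A typical record for a domain that sends through Google's SMTP servers and the web server itself looks roughly like the sketch below; the exact mechanisms to include depend on what actually sends mail for the domain.

      ; Illustrative SPF record (DNS TXT entry for mydomain.com)
      mydomain.com.  IN  TXT  "v=spf1 a mx include:_spf.google.com ~all"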

    Read the article

  • Weird .ASP pages from my non-ASP site generating 404s

    - by Amanda
    In Google Webmaster Tools, I have a huge list of "Not Found" crawl errors (404) with URLs that look like this: http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=3, seemingly originating from URLs very similar to that (e.g. http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=4). The thing is, the site is WordPress, and has been for almost a year now; it was plain HTML before that. I don't know where these ASP requests are coming from. Furthermore, the dates on which these supposed ASP pages requested the other ASP pages (resulting in the 404s) are very recent. What's going on?
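
    If the old .asp URLs are never coming back, one way to stop them accumulating as crawl errors, sketched below, is to answer them with 410 Gone so Google drops them faster. This is an illustrative .htaccess rule and assumes no .asp URLs should be served at all.

      # Illustrative .htaccess rule: tell crawlers the old ASP URLs are gone for good.
      RewriteEngine On
      RewriteRule \.asp$ - [G,NC,L]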

    Read the article

  • Format numbers with css

    - by Luc M
    Is it possible to format numbers using CSS? When I have 7000000.00, I would like it displayed as 7 000 000.00. I know I could write a backend (PHP, Perl...) function or a JavaScript function that returns the formatted number, but... The numbers that I want to format are in a table cell. I would like to have something like <td class="myformat">7000000.00</td> or <td><span class="myformat">7000000.00</span></td>
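
    Plain CSS has no facility for regrouping digits, so in practice the formatting ends up in the backend or in JavaScript after all. If the cells are already produced by PHP, a minimal sketch is to format the value as it is printed:

      <?php
      // Sketch: print 7000000.00 as "7 000 000.00" inside the table cell.
      $value = 7000000.00;
      // number_format(number, decimals, decimal separator, thousands separator)
      echo '<td class="myformat">' . number_format($value, 2, '.', ' ') . '</td>';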

    Read the article

  • Transfer websites and domains to new server

    - by Albert
    We currently have around 40 websites and 80+ domains/sub-domains in a shared 1&1 hosting package, and we just acquired a managed dedicated server with 1&1 as well. Now it's time to start transferring everything over to the new server. Transferring just the websites and databases wouldn't be a problem; it would take time, but it's pretty straightforward. The problem comes when transferring the domains; let me explain why. Many of the websites we have are accessible via sub-domains of a parent domain. Ideally, we would like to transfer the sites one by one, in order to check for each one that everything works fine on the new server. However, since we also need to transfer the domain so it's managed on the new server, once we do that, all the websites using that domain need to already be on the new server before transferring that domain, thus not allowing the "one by one" philosophy. Another issue is the downtime when transferring the domain, from the moment it stops working in the hosting package until it becomes active on the new server. I believe there's nothing we can do here. So my question is whether there's any way we can do the "one by one" transferring of the websites (and their corresponding sub-domains) in the circumstances described above. One idea I had would be: 1. Let's say we have website A, which is accessible using subdomain.mydomain.com (and there are many other websites accessible via other sub-domains of mydomain.com) 2. Transfer the files of website A to the new server 3. Point a test domain on the new server to website A's folder (the new server comes with a "test" domain) 4. Test if website A works with that "test" domain 5. In the old hosting, somehow point the real sub-domain (subdomain.mydomain.com) to the new location of website A, in a way that users always see the same URL as before 6. Repeat 2-5 for every website belonging to the same domain 7. Once all are working on the new server, do the actual transfer of the domain to the new server, and then re-create all the sub-domains and point them to their corresponding websites That way, users wouldn't notice that there's been a change (except for a small downtime of the websites when doing the domain transfer). The part I'm not sure about is point 5 of the above. Is there any way to do that? I mean do it in a way that users see the original domain all the time in their browser, even for internal pages (so not only for the "home page", which would be sub-domain.mydomain.com, but also for example for the contact page, which would be sub-domain.mydomain.com/contact.php). Is there any way to do this? Or are we SOL and we're going to have to transfer all at the same time?
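
    Regarding point 5, one approach is to reverse-proxy the sub-domain from the old hosting to the new server, so the visitor keeps seeing the old URL for every page. This only works if the old shared package allows mod_rewrite with the proxy flag (mod_proxy), which shared hosts often disable; the hostnames below are placeholders and the rule is a rough sketch, not a tested recipe.

      # Illustrative .htaccess on the OLD hosting for subdomain.mydomain.com:
      # silently fetch every request from the new server (requires mod_proxy).
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^subdomain\.mydomain\.com$ [NC]
      RewriteRule ^(.*)$ http://test-domain.new-server.example/$1 [P,L]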

    Read the article

  • Showing schema.org aggregate rating in Google rich snippets

    - by nickh
    After adding schema.org microdata markup for reviews and aggregate ratings, I expected review and rating information to show up in rich snippets. Unfortunately, neither is being shown. Google's Structured Data Testing Tool finds the microdata, and there are no errors or warnings on the page. Any idea what's wrong with the microdata markup? Example 1: Live Page: http://www.shelflife.net/ljn-thundercats-series-3/bengali Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fljn-thundercats-series-3%2Fbengali Example 2: Live Page: http://www.shelflife.net/star-wars-mighty-muggs/asajj-ventress Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fstar-wars-mighty-muggs%2Fasajj-ventress
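
    For comparison, a minimal schema.org/Product block with an aggregate rating looks like the sketch below (the rating values are placeholders, not taken from the pages above); it can be worth diffing the live markup against something this small to see which property or nesting level differs.

      <!-- Minimal illustrative Product + AggregateRating microdata -->
      <div itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Bengali (ThunderCats Series 3)</span>
        <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
          Rated <span itemprop="ratingValue">4.5</span>/5
          based on <span itemprop="reviewCount">11</span> customer reviews
        </div>
      </div>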

    Read the article

  • Interpretation of empty User-agent

    - by Amit Agrawal
    How should I interpret an empty User-Agent? I have some custom analytics code, and that code has to analyze only human traffic. I have a working list of User-Agents denoting human traffic and bot traffic, but the empty User-Agent is proving to be problematic, and I am getting lots of traffic with an empty user agent (about 10%). Additionally, I have crafted the human-traffic versus bot-traffic user agent list by analyzing my current logs, so I might be missing a lot of entries in there. Is there a well-maintained list of user agents denoting bot traffic, or, the inverse, a list of user agents denoting human traffic?
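
    A common, conservative treatment is to classify requests with a missing or empty User-Agent as non-human, since mainstream browsers always send one. A minimal PHP sketch of that check is below; the classification choice itself is a judgement call, not a standard.

      <?php
      // Sketch: treat an absent or empty User-Agent as non-human traffic.
      function looks_human($server)
      {
          $ua = isset($server['HTTP_USER_AGENT']) ? trim($server['HTTP_USER_AGENT']) : '';
          if ($ua === '') {
              return false;           // no UA at all: almost certainly a script or bot
          }
          // Very rough bot check on top of the empty-UA rule (extend with your own list).
          return stripos($ua, 'bot') === false && stripos($ua, 'crawler') === false;
      }

      if (looks_human($_SERVER)) {
          // record the hit in the custom analytics
      }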

    Read the article

  • dynamic urls and links on one web page

    - by John
    I am trying to figure out how to create dynamic links and URLs on a static webpage. What I want to do is the following: I have a single webpage, for example MYWEBPAGEdotCOM/INDEX.HTML, that will always look the same except for one link on the page. The link would be, for example: LINK TO AFFILIATE: affiliatedotCOM/my-affiliate_code_here_DYNAMIC_REFERER The only thing that would change is the "DYNAMIC_REFERER", with every dynamic URL on this page: MYWEBPAGEdotCOM/INDEX.PHP_id=test1 MYWEBPAGEdotCOM/INDEX.PHP_id=test2 MYWEBPAGEdotCOM/INDEX.PHP_id=test3 MYWEBPAGEdotCOM/INDEX.PHP_id=test4 which would change only the dynamic link on the page to: affiliatedotCOM/my-affiliate_code_here_test1 affiliatedotCOM/my-affiliate_code_here_test2 affiliatedotCOM/my-affiliate_code_here_test3 affiliatedotCOM/my-affiliate_code_here_test4 Can someone tell me how I could go about doing this? I don't want to have to make hundreds of pages, and this approach would save me from having to do so.
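
    If the page can be served as a single PHP file instead of static HTML, one small sketch is to read the id from the query string and drop it into the affiliate link. The parameter name and URL pattern below mirror the question's placeholders; the whitelist step is an added assumption worth keeping so nothing unexpected ends up in the link.

      <?php
      // index.php -- sketch: one page, one dynamic affiliate link.
      // Called as index.php?id=test1, index.php?id=test2, ...
      $referer = isset($_GET['id']) ? $_GET['id'] : 'default';

      // Only allow simple tokens in the referer value.
      if (!preg_match('/^[A-Za-z0-9_-]+$/', $referer)) {
          $referer = 'default';
      }

      $affiliateUrl = 'http://affiliate.example.com/my-affiliate_code_here_' . $referer;
      ?>
      <!-- The rest of the page stays identical for every visitor. -->
      <a href="<?php echo htmlspecialchars($affiliateUrl); ?>">LINK TO AFFILIATE</a>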

    Read the article

  • Apache config file. Redirect permanent gives 403 error

    - by Homunculus Reticulli
    I am changing my domain from foo.com to foobar.org. I used a Redirect permanent in my Apache config file, and then restarted Apache. When I try to access the old domain foo.com, I get a 403 error. This is what my Apache config file looks like: <VirtualHost *:80> ServerName foo.com #ServerAlias www.foo.com #ServerAdmin [email protected] Redirect permanent / http://www.foobar.org/ DocumentRoot /path/to/project/foo/web DirectoryIndex index.php # CustomLog with format nickname LogFormat "%h %l %u %t \"%r\" %>s %b" common CustomLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foo.access.log" common LogLevel notice ErrorLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foo.errors.log" <Directory /> Order Deny,Allow Deny from all </Directory> <Files ~ "^\.ht"> Order allow,deny Deny from all </Files> <Directory /path/to/project/foo/web> Options -Indexes -Includes AllowOverride All Allow from All RewriteEngine On # We check if the .html version is here (cacheing) RewriteRule ^$ index.html [QSA] RewriteRule ^([^.])$ $1.html [QSA] RewriteCond %{REQUEST_FILENAME} !-f # No, so we redirect to our front end controller RewriteRule ^(.*)$ index.php [QSA,L] </Directory> <Directory /path/to/project/foo/web/uploads> Options -ExecCGI -FollowSymLinks -Indexes -Includes AllowOverride None php_flag engine off </Directory> Alias /sf /lib/vendor/symfony/symfony-1.3.8/data/web/sf <Directory /lib/vendor/symfony/symfony-1.3.8/data/web/sf> # Alias /sf /lib/vendor/symfony/symfony-1.4.19/data/web/sf # <Directory /lib/vendor/symfony/symfony-1.4.19/data/web/sf> Options -Indexes -Includes AllowOverride All Allow from All </Directory> </VirtualHost> Can anyone spot what I may be doing wrong? The site foobar.org does exist, so I don't know why this error occurs. Help?
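
    One thing that sometimes helps in a vhost like this, where mod_alias's Redirect and mod_rewrite rules coexist, is to do the whole-site redirect with mod_rewrite instead, so a single module decides what happens to each request. The snippet below is an illustrative replacement for the Redirect permanent line, not a confirmed fix for this particular 403.

      # Instead of "Redirect permanent / http://www.foobar.org/",
      # send everything to the new domain with mod_rewrite (illustrative sketch).
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?foo\.com$ [NC]
      RewriteRule ^/?(.*)$ http://www.foobar.org/$1 [R=301,L]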

    Read the article
