Search Results

Search found 9763 results on 391 pages for 'ys pro'.

  • FTP file access problem

    - by Fahad Uddin
    I recently found malware on my website. I have backed up the site to my computer and am now trying to wipe my FTP space. When I try to delete the root folder, I get this error on all of the malicious files: "Response: 550 Could not delete index.php: Permission denied". I am the sole administrator of the FTP account, so permissions should not be an issue. My hosting provider does not seem to suffer from this problem, as their other websites are running without any malware. I have also tried changing the root folder's permissions to 777 to see whether that would let me delete the files, but I still get the same error. Please help out. Thanks

  • Apache, Rewrite Rule and Directories

    - by milo5b
    my sites-available/ file looks something like the following:

      <VirtualHost *:80>
          ServerAdmin webmaster@mysite
          ServerName mysite.co.uk
          ServerAlias www.mysite.co.uk
          DocumentRoot /home/mysite.co.uk/htdocs/
          <Directory /home/mysite.co.uk/htdocs/>
              Options -Indexes FollowSymlinks MultiViews
              AllowOverride All
              Order allow,deny
              allow from all
          </Directory>
          ErrorLog ${APACHE_LOG_DIR}/mysite.co.uk/error.log
          LogLevel warn
          CustomLog ${APACHE_LOG_DIR}/mysite.co.uk/access.log combined
      </VirtualHost>

    In .htaccess (in htdocs/), I have (amongst others) the following rewrite rule:

      RewriteRule ^enquiries$ /enquiries.php

    I also happen to have a directory named "enquiries" (/home/mysite.co.uk/htdocs/enquiries/), and when I hit the URL www.mysite.co.uk/enquiries I get:

      HTTP/1.1 301 Moved Permanently
      Date: Mon, 10 Dec 2012 18:53:37 GMT
      Server: Apache/2.2.16 (Debian)
      Location: http://www.mysite.co.uk/enquiries/
      Vary: Accept-Encoding
      Content-Type: text/html; charset=iso-8859-1

    and a browser displays the directory's contents. Now, I could easily rename the folder and get it sorted, but I would like to understand what is going on here. What would be the correct way to configure Apache so that it does not behave this way and instead honours the rewrite rule? If I have not explained myself clearly, please feel free to ask more questions; I would be happy to answer them. Thanks!
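
    What is most likely happening: the URL matches a real directory, so mod_dir issues its trailing-slash redirect before the per-directory rewrite rule ever wins out. A minimal, hedged sketch of one way around it, placed in the same .htaccess (assumes mod_dir's redirect is indeed the culprit):

      # Stop mod_dir redirecting /enquiries to /enquiries/ so the rewrite
      # below can map the request to the PHP script instead.
      DirectorySlash Off
      RewriteEngine On
      RewriteRule ^enquiries$ /enquiries.php [L]

    With DirectorySlash Off, keeping Options -Indexes (as in the vhost above) matters, since Apache will no longer normalise directory URLs before serving them.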

  • Download speed is 0.12 Mbps when tested with servers in the U.S., but 0.76 Mbps with local servers - is this normal? [closed]

    - by Graviton
    Feeling that my ISP is cheating me (my subscription package is 1 Mbps), I ran a speed test on my internet connection using www.speedtest.net. I tested the connection against two servers, one local (Malaysia) and one in the U.S. I found that while the upload speed remained constant, the download speed differed: 0.76 Mbps for servers in Malaysia, and 0.12 Mbps for servers in the U.S. I called the ISP, and they blamed it on intercontinental signal loss. But how can the speed differ by that much? If it really does, then we should always take the advertised broadband speed with a grain of salt, because it is not the speed we actually get. No?

  • Receiving requests where absolute URLs on a page are morphed into relative URLs

    - by Jacob
    In our web pages, we have a hyperlink with an href to an absolute URL: https://some.other.host.com/blah.aspx?var1=val1&var2=val2. For some reason, in our logs we see a lot of requests to URLs of this format: http://our.site.com/https:/some.other.host.com/blah.aspx?var1=val1&var2=val2. We don't have any JavaScript that would request that URL; it only appears inside a hyperlink. Is there some sort of known bot, browser plugin, bug, etc. that could be responsible for these requests being made?
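
    Whatever client turns out to be responsible, the mangled requests can at least be repaired at the server. A hedged .htaccess sketch, not a diagnosis (host names follow the question; it assumes Apache, which strips the leading slash and collapses the double slash before per-directory rules run):

      RewriteEngine On
      # Requests arrive as /https:/some.other.host.com/blah.aspx with the
      # scheme flattened into the path; bounce them back to the real URL.
      # The query string is carried over to the redirect automatically.
      RewriteRule ^https?:/(.+)$ https://$1 [R=301,L]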

  • DNS slows down in development environment

    - by Sequenzia
    I have a local development environment set up on my Mac. I am running an Ubuntu web server inside a VirtualBox VM, and I set up a hosts file entry on my Mac that points my dev site to the IP of the Ubuntu virtual server. Everything works well except that a lot (though not all) of the time it takes more than 5 seconds to load a page. I used Firebug to track down where the problem is: when it's slow, the DNS part of my request takes over 5 seconds. As I said, it's not all the time. Sometimes the name resolves and the page loads within milliseconds; the same page will be super fast on one click and then take over 5 seconds on the next. It's really slowing me down and I am not sure what is causing it.

  • Why are Facebook profiles Google-searchable?

    - by Jose
    Facebook has around 1B user profiles, and they can be found by searching in Google. However, I don't think these profiles are linked from anywhere, so how could Google discover them? As far as I know, sitemaps alone are not enough for that (http://webmasters.stackexchange.com/a/5151), as all URLs should be crawlable anyway. I ask because I also have a site with user profiles and would like to make them discoverable.

  • What's wrong with my htaccess? (500 Error)

    - by Dany Khalife
    I've written a small .htaccess file to redirect Internet Explorer users to a specific page. Here are the contents:

      # MS Internet Explorer - Mozilla v4
      RewriteEngine On
      RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4(.*)MSIE
      RewriteRule ^index\.php$ /sorry.php [L]

      # All other browsers
      #RewriteRule ^index\.html$ /index.32.html [L]

    Any clue why this would give a 500 Internal Server Error? I have used mod_rewrite before, so I have the module loaded there...
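
    Nothing in the snippet itself is obviously malformed, so one common culprit (offered as a hedged guess, not a confirmed diagnosis) is that the server configuration does not permit these directives in .htaccess, which Apache reports as a 500. A sketch of the usual fix in the vhost (the path is an assumption):

      <Directory /var/www/mysite/htdocs>
          # mod_rewrite directives in .htaccess need AllowOverride to
          # include FileInfo; a disallowed directive surfaces as a 500
          # ("RewriteEngine not allowed here" in the error log).
          AllowOverride FileInfo
      </Directory>

    Either way, the Apache error log will name the offending directive, so it is the first place to check.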

  • Huge difference between Facebook Ad Click figures and Apache log requests

    - by Gearóid
    We're running a Facebook ad campaign for our business, but there seems to be a huge discrepancy between the number of clicks registered and the number of requests made with "facebook.com" in the HTTP referrer. The difference can be anything between 40 and 80 clicks/requests. I understand why the Google Analytics numbers would be off, and I understand that the figures shouldn't match exactly, but surely if 100 people click the ad then I should be seeing at least 90 requests for the homepage with facebook.com as the referrer? Can anybody provide any insight into why this may be happening?

  • protecting CSS selectors on a large website

    - by Tim
    I have content that appears within a corporate website inside an iframe. Several departments contribute their own CSS files to manage the overall UI and design. My problem is that they may use selectors for elements like td (for instance), without notice. Of course that will affect my own content in the frame unless I add a class to every td. I'm just using td as an example: the generic style for any element could change without notice. Is there any method/convention/practice I can use to protect my own styling?

  • Using subdomains or directories for main categories?

    - by Matthieu
    I have a website which references places to travel to around the world. Those places are (of course) grouped by country. Here is an example of an actual URL on my website: http://awespot.org/country/105/iceland. I am wondering if it would be better, in terms of SEO, to separate the countries into subdomains: http://iceland.awespot.org/. I know Google considers subdomains to be different websites, so I am weighing the two options: separating would mean splitting the PageRank and the benefit of links to the website, but it would also let me create a "web" of related websites (all travel-related) that link to and benefit each other. I am only asking about SEO here; I know this choice raises other questions (user experience, administration... even password completion by web browsers).

  • How can I screen clients that try to register multiple times?

    - by Aba Dov
    My company offers a bonus to every client that registers, and we would like to prevent people from abusing this by registering several times. We have thought about filtering clients by IP (there is a problem with workplaces, where all stations share the same IP) or by cookies (if cookies are not allowed, we might lose a client). I would like your opinions on these two methods and would be glad to hear about others. Thanks

  • Are there any guidelines for laying out screen "real estate"?

    - by Corey
    I'm wondering if there is any information about creating a decent page layout so that your website will appeal to users at all resolutions; for example, the optimal width for pages. It seems like, at my resolution, most websites have their content centered, covering about 80% of the page, which is easy on the eyes. Or consider the height of a website's logo/header: some sites I stumble upon have a huge logo with links or navigation under it, so that I need to scroll down to see the actual content, like articles or images (these sites don't keep me for very long). I understand that every user is different and may have browser extensions, page zoom, or may be running some ancient system that displays in 640x480. I'm not looking for a "best" solution but rather some guidelines for designing to accommodate different resolutions. Basically, how can I make sure that I don't design a page where a paragraph displays as several easy-to-read lines at my resolution but turns into a single long line at 1920x1080, making it hard for the user to follow?

  • What is the best approach to copy public dynamic pages?

    - by Renan
    Situation: the government is supposed to publish official information online, such as acts and laws. Problem: they're using 90s expertise to do it. You can tell by the constant use of deprecated HTML tags such as <table> and the lack of any compression at all, which makes some documents go way over 700,000 bytes even though they're pure text. Side problem: some companies are actually editing and selling this content, which should be public and free. What I need to know is the best approach to offering this official content on my own site for free. I've thought of setting up a mirror that copies the official pages from time to time, since some of them are updated frequently; the copies would automatically be compressed, as all my pages are, via htaccess.
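
    For reference, a minimal sketch of the kind of htaccess compression being relied on here (assumes Apache with mod_deflate available):

      <IfModule mod_deflate.c>
          # Compress text-heavy responses; pure-text legal documents like
          # the ones described above compress extremely well.
          AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
      </IfModule>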

  • Root Domain Redirects Incorrectly to HTTPS Instead of to WWW

    - by Ari
    TL;DR - Why do visits to my website's homepage work without "www", but not visits to specific pages on it? I recently moved my website (Zappable.com) to a new web host, Red Hat's OpenShift (a PaaS). It requires using CNAME records to set up custom domains, something my domain name registrar (1&1) does not support without a hosting plan. So I set up CloudFlare in between my domain and web host, and created a CNAME record there. I then pointed a 1&1 "www" subdomain to CloudFlare, and pointed my 1&1 root domain to the "www" subdomain. This works fine for visits to my homepage, but for some reason it does not work when visiting a specific page without "www": instead of adding "www", it goes to HTTPS, which is strange.
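
    If the upstream forwarding cannot be fixed, the canonical-host redirect can also be enforced at the application server. A hedged .htaccess sketch (it assumes the OpenShift gear fronts the app with Apache and honours .htaccess; the domain follows the question):

      RewriteEngine On
      # Send any bare-domain request to the www host, keeping the path.
      RewriteCond %{HTTP_HOST} ^zappable\.com$ [NC]
      RewriteRule ^(.*)$ http://www.zappable.com/$1 [R=301,L]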

  • Mod Rewrite - URL rewriting

    - by modrewriteNewbie
    I am very new to mod_rewrite. I need to redirect any user with a "citizenhawk" parameter in their URL to my root URL. For example,

      http://www.mywebsite.com/?sc=CX12N003&cm_mmc=affiliate--citizenhawk--nooffer-_-na&prfc=5&clickid=0004c845fa9a87050a4277221a003262

    should result in

      http://www.mywebsite.com/

    Here are my rewrite conditions:

      RewriteCond %{QUERY_STRING} (&|^)cm_mmc=(.)citizenhawk(.)(&|$)$
      RewriteRule ^/rrs/ [NC,R=302,L]

    Where am I going wrong? Is my RewriteCond wrong?
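
    A hedged sketch of what was probably intended, not a confirmed fix: as written, the RewriteRule has a pattern but no substitution (Apache will treat "[NC,R=302,L]" as the target), (.) matches exactly one character where (.*) was presumably meant, and in per-directory .htaccess context the pattern never starts with a slash:

      RewriteEngine On
      RewriteCond %{QUERY_STRING} (^|&)cm_mmc=([^&]*)citizenhawk [NC]
      # Match any path; the trailing "?" drops the query string from
      # the redirect target so the visitor lands on the bare homepage.
      RewriteRule ^ http://www.mywebsite.com/? [R=302,L]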

  • Make my website's dynamically loaded data available to the Facebook Open Graph scraper

    - by fvaliquette
    Here is the design of my web site: the user enters myWebsite.com/a/1, and .htaccess rules redirect to myWebsite.com/b. The ExtJS JavaScript library then loads, extracts the value from the URL (in this case "1"), loads ./xml/1.xml, sets the Open Graph data (title, type, image, etc.) from 1.xml, and loads the data shown to the user from 1.xml into the website. My question is: how can I make the Open Graph data available to Facebook? Facebook does not load my ExtJS JavaScript library before extracting the Open Graph object values from the HTML. Is there an easy solution to this problem? The only solutions I have found are to make static web pages or pages rendered dynamically on the server side, but I would like to avoid these, since my web page implementation is already finished and I would like to avoid reworking it.
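
    One common workaround, sketched here under stated assumptions (the /og/ snapshot directory is invented, and it presumes static Open Graph pages can be pre-generated from the same XML): serve Facebook's scraper, which does not execute JavaScript, a static page, while ordinary visitors keep getting the ExtJS app.

      RewriteEngine On
      # Facebook's scraper identifies itself as "facebookexternalhit".
      RewriteCond %{HTTP_USER_AGENT} facebookexternalhit [NC]
      RewriteRule ^a/(\d+)$ /og/$1.html [L]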

  • Website restyle, SEO migration plan?

    - by Goboozo
    I am currently working on a project for one of my biggest clients: we have built a website that will replace their old website. The actual content is largely the same; however, the presentation of the content has changed drastically, and from our point of view it is much more user-friendly (the main reason for updating the site). Now, since the site's presentation has changed, we have major changes in:

      HTML & CSS: to change the presentation of the content
      URLs: to make them easier to understand (301 redirects have been taken care of and are in place)
      Breadcrumbs: to enhance navigation (we have made the breadcrumbs match the URLs exactly)
      Pagination: added to enable content browsing
      Title tags: descriptive title tags added to the major links and buttons

    Basically, all user content, including meta tags, has remained the same. Since this company is rather successful and 90% of its clients come from Google's organic results, I am obliged to take all necessary precautions. People tell me I need a migration plan to prevent the site being hurt in Google, but I have never worked with such a plan... So, based on the above: would you consider a migration plan necessary, and what precautions/actions would you recommend to keep us from losing our SERP positions? Many thanks in advance for your answers.
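
    For concreteness, a sketch of the kind of per-URL 301 mapping the second list item refers to (the old and new paths are invented examples, not the client's real URLs):

      RewriteEngine On
      # Map an old, opaque path onto its new, readable counterpart so
      # existing rankings and backlinks transfer to the new URL.
      RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]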

  • I need a webpage to host my javascript!

    - by Amir Reza
    Does anyone know a website that would host my JavaScript on their pages? I have a research project that needs to collect some RTT measurements from all over the world and compare them. I have written the JavaScript code for this, but I do not have a high-hit-rate website to put it on to collect data. I know it is a bit of an odd question, but do you know any website or any trick that could help me? Note that the script would not do any harm to anybody! :-) Thanks. Decad is right: I basically need some people to put my script on their high-hit-rate websites so I can collect data from a large number of clients. Of course, the script runs in the background with no harm to the page; it basically measures some RTTs and submits them to a server. I already have some pages, but they barely get a hit from outside! Thanks

  • When reversing a Google Analytics e-commerce transaction is the per-unit price positive or negative?

    - by Michael Glenn
    Google's own instructions for reversing an e-commerce transaction seem to contradict themselves regarding the unit price. The instructions state that "the item field has a positive per-unit price and a negative quantity", yet the code sample has a negative per-unit price and a negative quantity:

      _gaq.push(['_addItem',
          '1234',          // order ID - necessary to associate item with transaction
          'DD44',          // SKU/code - required
          'T-Shirt',       // product name
          'Olive Medium',  // category or variation
          '-11.99',        // unit price - required
          '-1'             // quantity - required
      ]);

    Which is correct?

  • Is it better to have AWS EC2 and RDS in the same Availability Zone?

    - by Dan
    I run a web app on an AWS EC2 instance, with the database for the app on an RDS instance, both in the AWS East-1 region. However, one of them is in Availability Zone 1a and the other is in 1d. Am I getting all the speed benefits of having both instances in the same "data center" (East-1) even though they are in different Availability Zones, or can I optimize by moving them to the same Availability Zone?

  • My blog is not even ranking for exact title matches [on hold]

    - by Akshay Hallur
    I have original, detailed blog posts about blogging and SEO. The domain was dropped (expired) twice before my acquisition; I am its 3rd owner and have had it for 143 days. The posts are not ranking even for exact title searches: Google+ or LinkedIn shares show up instead of my content, and some posts are not indexed at all. I am getting barely around 7 organic visits a day. Example 1: http://www.infoflame.com/offer-pdf-of-blog-posts-for-likes-and-shares/ (title: "Offer Readers PDF of Blog Posts for Their Likes and Shares") is not indexed at all. Example 2: http://www.infoflame.com/anchor-text-for-seo/ is indexed but does not come up for the exact title. My suspicion: a dropped domain that may have been used for spam (the Wayback Machine shows 3 captures since 2004 and the 2 drops; I don't know whether there was email spam), but there are no manual actions in WMT, so no reconsideration request applies. What is the reason for this? Should I wait? How can I tell Google that ownership has changed and the domain is now spam-free? Or should I de-index it and start a new blog? Thank you for any advice.

  • Hosting a web application on DiscountASP.NET using SQL CE 5

    - by David Stanley
    I am hoping someone has experience with this, since the DiscountASP site is very lacking in straightforward answers. I am building a lightweight web application and have decided to use SQL CE as its database. Two questions regarding this: do I need an actual hosted database as well as the site in order for it to work? And do you know whether DiscountASP supports the use of SQL CE (not with WebMatrix or any CMS builds; completely custom)? If they don't, do you have any experience/recommendations with getting this done?

  • is RapidSSL's wildcard cert supported by major browsers?

    - by Jorre
    I'm thinking of buying a wildcard SSL cert from ClickSSL: http://www.clickssl.com/rapidssl/rapidsslwildcard.aspx. That would be a RapidSSL certificate, and I was looking through my Firefox options to see if RapidSSL is in the list of recognized authorities, but my certificate manager doesn't mention RapidSSL anywhere. Am I looking for the wrong name, e.g. is RapidSSL recognized by browsers under a different name? I want to be sure that this certificate works in all major browsers (including IE6).

  • Error when setting up Piwik analytics

    - by bertran
    I've uploaded the latest version of Piwik to my web server, which is hosted by GoDaddy.com on a Linux hosting plan. I'm setting it up (accessing it from my browser as instructed) and have the Piwik installation page open at step 3 (database set-up) of 9. I don't know what to input in the "database server" field; the default is 127.0.0.1. When I leave that input as is and click "Next", it gives the error:

      Error when trying to connect to database server: SQLSTATE[HY000] [2013] Lost connection to MySQL server at 'reading initial communication packet', system error: 111

    and changing that input to "localhost" gives me another error:

      Error when trying to connect to database server: SQLSTATE[HY000] [2002] Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)

  • Analytics Tracking and SEO

    - by Mahesh
    I'm using Piwik on some of my websites, having recently switched from Google Analytics, and I find most features are the same in both. But I have always had this question in mind: what am I supposed to track other than these?

      Bounce rate
      Referral sites
      Keywords
      Geolocation
      Periodic data (month, year, week) for the above factors

    Are there any other SEO factors to consider when tracking with any analytics software?
