Search Results

Search found 14789 results on 592 pages for 'pro backup'.

  • .htaccess URL rewriting problem

    - by letsworktogether
    I'm kind of stuck at this part and was hoping to get some assistance. I'm building a highscores page in PHP; that part works fine. However, I dislike the idea of "index.php?skill=name" and wanted a bit of SEO in this, so I have successfully replaced the URL with a friendlier version: "highscores/skill/name". And this is where the problem starts: I have added pagination to the highscores, and the page is read from the HTTP GET page variable ($_GET['page']). I dislike the idea of "highscores/skill/name&page=2" and was hoping you could help me make the URL look like the following. Page 1, accessing the file without declaring a page number: DOMAIN.TLD/highscores/skill/name. Page 2, where the page variable is now needed: DOMAIN.TLD/highscores/skill/name/2. As you can tell, the "2" defines page 2 and should load the correct data for that page. My latest attempt to get it to work is: RewriteRule ^highscores\/skill\/(.*?)(\/(.*?)*)$ highscores/skills.php?skill=$1&page=$2 [L] # Skills page. Unfortunately it does not work: it makes the page look horrible (the CSS doesn't load) and it doesn't go to the page specified in the URL. I hope you understand my issue. Thank you!
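
    For what it's worth, a minimal .htaccess sketch of the rule pair being asked for (assuming Apache mod_rewrite, with the same paths as in the question). The broken CSS is usually a side effect of relative stylesheet paths once the URL gains extra slashes; absolute paths (href="/css/style.css") or a <base href="/"> tag avoid that:

        RewriteEngine On
        # Page 1: /highscores/skill/name
        RewriteRule ^highscores/skill/([^/]+)/?$ highscores/skills.php?skill=$1 [L,QSA]
        # Page N: /highscores/skill/name/2
        RewriteRule ^highscores/skill/([^/]+)/([0-9]+)/?$ highscores/skills.php?skill=$1&page=$2 [L,QSA]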

  • Unintentional hosting of material protected by copyright law

    - by spacemonkey
    I am interested in how copyright law works in the case of Rapidshare, YouTube, etc. Hypothetically speaking: what if I create a website for uploading and sharing MP3 files, and some users start uploading songs in MP3 format that are protected by copyright? Can I get sued for this, given that it wasn't me who uploaded that content? Thanks! P.S. Is there also a good source where I could read about legal caveats related to copyrighted material?

  • Yandex frequently replaces page names with ampersands

    - by Guy
    The Yandex spider is a frequent visitor to one of the sites I manage. On occasion it replaces the page name with two ampersands and a space, so if the page is /mypage.aspx?param=value, it will try to crawl it as /&& ?param=value. Any idea why it is doing this? [EDIT] If I remember correctly, the IP this "mistake" is coming from is based in California, not Russia; I believe they crawl US sites from a US-based IP address. Not sure if that helps. More info about the request: IP: 199.21.99.82; City: Palo Alto; State: California; Country: United States; ISP: Yandex Inc.; User-Agent: Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)

  • Installing the MoinMoin wiki in my user directory on a server with no root access

    - by deiga
    Hello all, I've been trying to install the MoinMoin wiki on a webserver where I have no root access. The server doesn't support WSGI, but it does support CGI/FastCGI/etc. I've scoured Google for a simple guide on how to accomplish this, but the only guides I found were from around 2004, and other guides always assume you have root access. Can anyone link a good tutorial for my situation, or just help me out here? Your help is appreciated :) P.S. Sorry if this is the wrong Stack Exchange site.
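
    Not an authoritative recipe, but a sketch of the Apache side for a shared host that allows per-directory CGI (whether your host permits these overrides is an assumption; MoinMoin 1.x ships a CGI entry script, here called moin.cgi):

        # .htaccess in ~/public_html/wiki (assumes AllowOverride permits this)
        Options +ExecCGI
        AddHandler cgi-script .cgi
        DirectoryIndex moin.cgi

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ moin.cgi/$1 [L]

    The moin.cgi script itself then needs its shebang pointed at a Python installation where MoinMoin is available, e.g. one installed into your home directory with a --prefix build.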

  • Is there an app/script I can deploy to enable my users to change their own LDAP passwords?

    - by Tom Wright
    I've recently enabled LDAP-based authentication on my domain. This has allowed us to use a single set of credentials to administer the blog, the forum and the wiki. Unfortunately, it has come at the cost of users being able to change their own passwords. Ideally, users would be able to visit a page (e.g. mydomain.com/account), authenticate, and then change their password. Does anyone know of a script or app that will allow me to do this quickly and easily? I guess it wouldn't be hard to write in PHP, but I'd prefer not to have the hassle.
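
    In case it helps, a minimal PHP sketch of the self-service flow (assumptions: an OpenLDAP-style server reachable over LDAPS, users under ou=people,dc=mydomain,dc=com, and PHP >= 7.2 for ldap_exop_passwd; a sketch, not hardened production code):

        <?php
        // Self-service password change: bind as the user to verify the old
        // password, then ask the server to set the new one.
        $uid = $_POST['username'] ?? '';
        $old = $_POST['old_password'] ?? '';
        $new = $_POST['new_password'] ?? '';

        // DN layout is an assumption; adjust to your directory tree.
        $dn = sprintf('uid=%s,ou=people,dc=mydomain,dc=com',
                      ldap_escape($uid, '', LDAP_ESCAPE_DN));

        $ldap = ldap_connect('ldaps://ldap.mydomain.com');
        ldap_set_option($ldap, LDAP_OPT_PROTOCOL_VERSION, 3);

        if (!@ldap_bind($ldap, $dn, $old)) {   // authenticates the old password
            http_response_code(403);
            exit('Current password incorrect.');
        }

        // RFC 3062 password-modify extended operation; the server hashes it.
        if (ldap_exop_passwd($ldap, $dn, $old, $new) === false) {
            http_response_code(500);
            exit('Password change failed: ' . ldap_error($ldap));
        }
        echo 'Password updated.';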

  • Slashdotted web site seeks new home

    - by Arthur Edelstein
    I am maintaining a website that contains mostly simple HTML (just a little PHP). Normally the site receives only 4,000 hits per month, but it was recently slashdotted by the New York Times (30,000 visitors and 30 GB in a day), and the web host (Bluehost) throttled the CPU in response, which slowed the website considerably. Which web hosts would offer a more scalable solution? Ideally I would like a high-quality host that charges by the GB and can let bandwidth expand during sudden slashdotting episodes without a reduction in performance.

  • Simple CMS for clients?

    - by jae
    I'm looking for something fairly simple for clients to use and very easy for me to skin (somewhat complicated themes): something like Perch or Couch CMS, but it has to be open source to satisfy some license requirements. I can't really seem to find any CMS like this (WordPress, Drupal, etc. are way too heavy for this use). There aren't many requirements; anything PHP/MySQL will really work, as each site gets its own jailed instance and so on. Any pointers are really appreciated.

  • How to open the JavaScript console in different browsers?

    - by Šime Vidas
    Chrome: Press CTRL + SHIFT + I to open the Developer Tools, then click the "Open console" icon in the bottom left corner.
    Safari: Press CTRL + ALT + I to display the Web Inspector, then click the "Open console" icon in the bottom left corner. Note: this only works if the "Show Develop menu in menu bar" check box in the Advanced tab of the Preferences menu is checked!
    IE9: Press F12 to open the developer tools, open the Script tab, then click the "Console" button on the right.
    Firefox 4: Press CTRL + SHIFT + K to open the Web Console.
    What about Opera 11? Clarification: by console I mean the JavaScript console that lets you input and execute JavaScript code.

  • <meta name="robots" content="noindex"> in "Fetch as Google"

    - by Rodrigo Azevedo
    I don't know why, but when I execute "Fetch as Google" it returns the following:

        HTTP/1.1 200 OK
        Cache-Control: private
        Content-Type: text/html
        Content-Encoding: gzip
        Vary: Accept-Encoding
        Server: Microsoft-IIS/7.5
        Set-Cookie: ASPSESSIONIDQACRADAQ=ECAINNFBMGNDEPAEBKBLOBOP; path=/
        X-Powered-By: ASP.NET
        Date: Wed, 26 Jun 2013 15:18:29 GMT
        Content-Length: 153

        <meta name="robots" content="noindex">

    The noindex tag doesn't exist in my page. Does anybody know what could be wrong?
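
    One way to investigate, assuming the server (or some component in front of it) varies its output by user agent: fetch the page twice with curl and compare the results (the URL here is a placeholder):

        # as a regular browser
        curl --compressed -s -D - -o page_browser.html "http://example.com/page.asp"
        # as Googlebot
        curl --compressed -s -D - -o page_googlebot.html \
             -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
             "http://example.com/page.asp"
        diff page_browser.html page_googlebot.html

    If the Googlebot copy contains the noindex tag and the browser copy doesn't, something server-side (a rogue rewrite, a bot-detection module, or malware) is serving crawlers different markup.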

  • Link tags in iframe widget

    - by john Smith
    I have a rating community site, and I'm offering little iframe widgets with the average rating and some other info. Does it make sense (for visibility and SEO) to add link tags to the head, like: <link rel="alternate" type="application/rss+xml" title="RSS 2.0" href="rssfeed" /> <link rel="index" title="main-profile" href="main-profile"> to get a logical association between the widget and its related pages? How would you do this?

  • Redirecting a non-existing post to the homepage: is that good for SEO?

    - by BlackEagle
    I am checking my website in Google Webmaster Tools and I am seeing an astonishing 5000 links that could not be found by Google's crawlers. That's normal, because my website is built in a manner that lets users drop their own things, which also leads to 404 pages. Not a problem at all if I can find a workaround, of course... So my question is: what if I made a function or a mod_rewrite rule that checks whether the link exists (a post, for example) and, if not, redirects to the home page? Is this good for SEO? Will Google see this as "link found"? How should I look at this problem?
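
    For context, redirecting missing posts to the homepage is exactly the pattern Google tends to flag as a "soft 404"; the usual advice is to serve a real 404 status with a helpful page instead. A minimal PHP sketch (postExists() is a hypothetical lookup against your own data):

        <?php
        // Serve a genuine 404 for posts that no longer exist, instead of
        // redirecting to the homepage (which Google reports as a soft 404).
        if (!postExists($_GET['post'] ?? '')) {
            http_response_code(404);        // PHP >= 5.4
            include __DIR__ . '/404.php';   // friendly page with links back home
            exit;
        }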

  • 302 Moved Temporarily or 301?

    - by user11221
    I have a question on redirects. An HTTP status code checker tool shows "HTTP/1.1 302 Moved Temporarily" for the home page URL http://someurl.com (just a namesake URL). Also, this URL opens up http://www.someurl.com/general/index. As you can see, a non-www to www redirect is happening. My questions are: Is a 302 redirect acceptable for the home page? Will this affect the site showing up in search results in any way? Isn't redirecting to /general/index a bad practice?
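
    If the host turns out to be Apache, a sketch of the usual permanent (301) non-www to www redirect, using the placeholder domain from the question:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^someurl\.com$ [NC]
        RewriteRule ^(.*)$ http://www.someurl.com/$1 [R=301,L]

    A 301 tells search engines the www URL is the permanent home, so ranking signals consolidate there; a 302 leaves them guessing.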

  • phpBB3 antispam: mod for "patrolling" the forum?

    - by STATUS_ACCESS_DENIED
    I've been working on various antispam measures in a phpBB3-based forum I host. Now I'm thinking of an extension/mod that ties in with the editing of posts (and later perhaps signatures/profiles), in that new or edited text defaults to something like "not patrolled", and moderators could then review, in a special queue, text that contains links or similar items (based on heuristics). Now the question: does such a mod exist (I didn't find one)? If it does, and anyone has used it, please include your experiences with it in your answer.

  • Why does Google return a soft 404 when I redirect to the signup page?

    - by Hettomei
    For the past month I've seen an increasing number of "soft 404" errors reported by Google Webmaster Tools, although the pages work well for users. I made some fixes but can't figure out how to solve it. Configuration (maybe useless): the website is built with Rails 3.1, and authentication is handled by the gem Devise. Problem: on the page http://en.bemyboat.com/yacht-charter/9965-sailboat-beneteau-oceanis-43, when you click on "Ask a Boat request" (a simple GET form pointing to http://en.bemyboat.com/boat_requests/new/9965), you are redirected with HTTP status 302 to sign in, and then sent back to the new page after successfully signing in. Google tells me that the link behind "Ask a Boat request" returns a soft 404. I can't make this form a POST (which would solve the problem) because we need to automatically redirect the user to the right page after sign-in (Devise memorizes the GET link). To simplify, the question is: how do I protect a private page with authentication, reached with a simple GET, without being penalized by Google with a soft 404? Thank you. P.S. This website's English still needs work; please bear with it.
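
    One hedged workaround, assuming the request forms all live under the URL pattern shown above: keep crawlers away from the login-gated GET URLs altogether, either via robots.txt or by adding an X-Robots-Tag: noindex header to the sign-in response. A robots.txt sketch:

        User-agent: *
        Disallow: /boat_requests/new/

    Googlebot then never follows the link into the 302, so there is nothing for it to classify as a soft 404.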

  • Does Google sometimes prevent new white hat sites from ranking at all in some verticals?

    - by JVerstry
    Assume someone wants to launch a new viagra or acai berry e-commerce website. There is a lot of competition, and this site does not really bring anything new other than another online counter to buy products at a nice price. Assume this site does not use any black hat techniques and stays within Google's quality guidelines, and assume it has no (or few) backlinks (from non-authoritative websites). Assume this website's pages are indexed properly in Webmaster Tools and that no penalties are reported, no site improvements are suggested, Google crawls the site daily as reported in GWT, and there are no robots.txt configuration issues. Does Google sometimes decide not to rank such a site for any user query (for weeks) because of a lack of original content? The reason I am asking is that I am trying to understand the possible cause of a similar situation I am observing with two sites. If so, what is the way out to start ranking for these sites? If not, does it mean the cause is elsewhere for sure? Any confirmed info to get out of the maze is welcome.

  • Grapeshot crawler ignoring robots.txt

    - by QF_Developer
    Has anyone come across a crawler called Grapeshot? It is hammering the same page repeatedly on our website. I believe it is looking for ad-related keywords, based on previous content ad campaigns; the odd thing is we never ran any such campaigns on the page it is so interested in. We do have a few pages running AdSense; is this what has attracted Grapeshot? I've added the following declaration to my robots.txt, but they don't seem to be honouring it: User-agent: grapeshot Disallow: / Any ideas on how to block this nuisance crawler? I'm starting to think the best way is by setting up IP rules in IIS?
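
    Since the site runs on IIS, one sketch of a user-agent block (an assumption: the IIS URL Rewrite module is installed; matching by user agent is more robust than chasing IPs, though a determined crawler can spoof either):

        <!-- web.config fragment: refuse any request whose User-Agent mentions
             grapeshot (URL Rewrite patterns ignore case by default) -->
        <system.webServer>
          <rewrite>
            <rules>
              <rule name="BlockGrapeshot" stopProcessing="true">
                <match url=".*" />
                <conditions>
                  <add input="{HTTP_USER_AGENT}" pattern="grapeshot" />
                </conditions>
                <action type="CustomResponse" statusCode="403"
                        statusReason="Forbidden"
                        statusDescription="Crawler blocked by policy" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>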

  • How can I convince IE to honor my explicit instructions to make a table column X pixels wide? [migrated]

    - by AnthonyWJones
    Please consider this small but complete chunk of HTML:

        <!DOCTYPE html>
        <html>
        <head>
          <title>Test</title>
          <style type="text/css">
            span { overflow: hidden; white-space: nowrap; }
            td { overflow: hidden; text-overflow: ellipsis; }
          </style>
        </head>
        <body>
          <table cellspacing="0">
            <tbody>
              <tr>
                <td nowrap="nowrap" style="max-width:30px; width:30px; white-space:nowrap;"><span>column 1</span></td>
                <td nowrap="nowrap" style="max-width:30px; width:30px; white-space:nowrap;"><span>column 2</span></td>
                <td nowrap="nowrap" style="max-width:30px; width:30px; white-space:nowrap;"><span>column 3</span></td>
              </tr>
            </tbody>
          </table>
        </body>
        </html>

    If you render the above in Chrome you'll see the effect I'm looking for. However, render it in IE8 or 9 and the width and/or max-width is ignored. So my question is: how do I get IE to simply let me specify the width of a cell explicitly? BTW, I've tried various combinations of table-layout:fixed and using colgroup with cols and all sorts; nothing I've tried convinces IE to do what I'm clearly asking it to do. If I had any hair before starting this, I wouldn't have any left by now.
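
    One combination that may be worth trying (hedged, since the question says some table-layout:fixed variants were already attempted): fixed layout generally only takes effect when the table element itself has an explicit width, which is the piece missing from the snippet above. A CSS sketch assuming three 30px columns:

        table { table-layout: fixed; width: 90px; }  /* 3 columns x 30px: assumed */
        td {
          overflow: hidden;
          text-overflow: ellipsis;
          white-space: nowrap;
        }

    Under a fixed layout the browser treats the declared column widths as authoritative instead of sizing cells to their content, which is usually what lets text-overflow: ellipsis kick in.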

  • Redirect AFTER Initiating download

    - by mashup
    I have a question: is there any way to initiate a download and, AFTER the user has confirmed the download, redirect to another site? Is something like that possible via ASP or another language commonly used for websites? Bad PHP "user experience" scenario (in use right now): (a) user comes to the site and clicks the download button; (b) user sees the "download" landing page and gets redirected after 5 seconds; (c) the download starts on the thank-you page. Good "user experience" scenario (my dream solution, what I want): (a) user comes to the site and clicks the download button; (b) the download starts immediately on the landing page; (c) once the download is confirmed, the page redirects to the thank-you page. Any programming language is a go for this.
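
    A caveat first: browsers don't expose the "Save" dialog to page scripts, so there is no reliable signal that the user confirmed the download; the common approximation is to start the download immediately and redirect after a short delay. A hedged HTML sketch (file and page names are placeholders):

        <body>
          <p>Your download is starting...</p>
          <!-- a hidden iframe starts the file download without leaving the page -->
          <iframe src="/files/download.zip" style="display:none"></iframe>
          <script>
            // after a grace period, move on to the thank-you page; the download
            // keeps running because it lives in its own connection
            setTimeout(function () {
              window.location.href = "/thank-you.html";
            }, 3000);
          </script>
        </body>

    For the iframe to trigger a save dialog rather than display the file inline, the server should send the file with a Content-Disposition: attachment header.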

  • Letting search engines know that different links to identical pages stress different parts of the page

    - by balpha
    When you follow a permalink to a chat message in the Stack Exchange chat, you get a view of the transcript page for the day that contains the particular message. This message is highlighted in yellow, and the page is scrolled to its position. Sometimes (admittedly rarely, but it happens) a web search will result in such a transcript link. Here's a (constructed, obviously) example: a Google search for strange behavior of the \bibliography command site:chat.stackexchange.com gives me a link to this chat message. This message is obviously unrelated to my query, but the transcript page does indeed contain my search terms, just in a totally different spot. Both the above links lead to the same content, and Google knows this, since both pages have <link rel="canonical" href="/transcript/41/2012/4/9/0-24" /> in their <head>. The only difference between the two links is which message has the highlight CSS class. Is there a way to let Google know that while all these links have the same content, they put an emphasis on a different part of the content? Note that the permalinks on the transcript page already have a #12345 hash to "point" to the relevant chat message, but Google appears to drop it.

  • Issues with web hosting at home

    - by hari
    I want to host a small personal website at home. One basic problem I am hitting: from inside the home network, I cannot access my domain name; I have to use the local IP (something like 192.168.1.4) of the desktop that is hosting the website. Because of this mapping, I have issues setting up a simple WordPress blog on it too. How do I get past this issue? Edit: when I try to access www.example.com (my domain) from within my home network, I get redirected to my router's login page. P.S. 1) I am using the DynDNS service to map my non-static IP to my domain name. 2) My port forwarding works fine.
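
    Being bounced to the router login from inside the LAN is the classic symptom of a router that doesn't support NAT loopback (hairpinning). If the router can't be configured for it, a common workaround is a hosts-file override on the machines inside the network (the IP and the placeholder domain are taken from the question):

        # /etc/hosts on Linux/Mac, or C:\Windows\System32\drivers\etc\hosts
        # point the public name straight at the LAN address of the server
        192.168.1.4   example.com www.example.com

    WordPress then also needs its Site URL set to the domain name (not the LAN IP), so the links it generates work both inside and outside the network.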

  • Sharing unique links on social media vs SEO

    - by MJWadmin
    We're currently implementing a voucher system on our site which will allow our users to obtain a 25+% discount on certain products, provided they donate 10% of the purchase price to charity. We will offer the ability to share the discounts via social media in return for larger discounts for the sharer for each person who clicks through the link and buys an item. I understand that social links have SEO benefits, but this appears to be based on lots of people sharing the same link. If our voucher users share a unique link, i.e. http://ourdomain.com/sipsfesdf rather than a fixed link http://ourdomain.com/product-name, will we still receive the same benefits? Should we instead share something like http://ourdomain.com/product-name/sipsfesdf? Thanks in advance.
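
    One common approach, on the assumption that every unique share URL renders the same product page: have each variant declare the fixed URL as canonical, so whatever link signals the variants earn are consolidated onto one page:

        <link rel="canonical" href="http://ourdomain.com/product-name" />

    The /product-name/sipsfesdf form from the question pairs naturally with this, since the tracking token stays in the path while the canonical tag points search engines at the clean URL.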

  • Best way to track multiple sites with Google Analytics

    - by stevether
    I currently have 63 websites (and counting) that I'm tracking in one Google Analytics account, and I'm starting to realize this is becoming a bit cumbersome. What's the best way to collect traffic data in bulk? Are there other resources out there better suited for this task? Does Google offer a bulk option for this kind of thing? Would it be better to make separate Analytics accounts? I'm just wondering if anyone else has found a better solution than manually setting up all these accounts, setting up the tracking codes, etc., when it comes to large-scale management.

  • PHP profiling on a production server, or other options

    - by absentx
    Alright, I need some help here. I am commonly asked to speed up certain sections of some websites that I program for, but I have yet to figure out how to use a good PHP diagnosis/profiling tool. Some things to consider: the sites I am working on are already built, and getting a testing server set up locally is just a huge pain; I'd have to rewrite include paths and so many other things. This is a results-oriented deal, and spending days getting a site fully working on a testing platform so I can debug one page probably isn't an option. I can write tons of PHP, but I have no clue how to interact with or mess with servers, so every tutorial I read about setting up Xdebug or XHProf seems to involve getting something installed on a production server that I don't have access to or have no clue how to work with. So are there any solutions out there that will show me where my PHP is slow without having to do all sorts of server work that I just don't know how to do? XHProf seems the closest to usable for me, but from what I can tell it still has to be installed on the server. If anyone can point me in the right direction on this I would be very grateful. Maybe getting these things put on the server isn't a big deal, but I have never interacted with server command lines or anything like that; I suppose I should start sometime, but I really have no idea where. Plus, I realize that profiling on a live platform is not the greatest idea either, but I feel I am in a tough spot: I have speed issues to solve, and setting up a local environment, while a great idea, just doesn't seem practical at the moment.
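
    Short of installing an extension, plain microtime() checkpoints need nothing from the server and often localize the slow spot. A coarse, zero-install sketch (the checkpoint labels are placeholders; keep the log file out of the web root):

        <?php
        // Coarse timing checkpoints, logged on shutdown (PHP >= 7.1 syntax).
        $checkpoints = [['start', microtime(true)]];

        function checkpoint(string $label): void {
            global $checkpoints;
            $checkpoints[] = [$label, microtime(true)];
        }

        register_shutdown_function(function () {
            global $checkpoints;
            $prev = $checkpoints[0][1];
            $out  = '';
            foreach ($checkpoints as [$label, $t]) {
                $out .= sprintf("%-25s +%.1f ms\n", $label, ($t - $prev) * 1000);
                $prev = $t;
            }
            // never echo timings to visitors; append to a private log instead
            file_put_contents('/tmp/profile.log', $out . "\n", FILE_APPEND);
        });

        // sprinkle through the slow page:
        checkpoint('after db connect');
        checkpoint('after main query');
        checkpoint('after template render');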

  • Are you aware of any client-side malware that sends lots of junk requests for .gifs?

    - by Matt Sherman
    I am getting dozens of 404 errors on my site that are requests for GIFs with apparently random names, like 4273uaqa.gif and 5pwowlag.gif. I see that most of them are coming from one user, so I assume something is happening in the background on her machine without her knowledge, i.e. a malware thing on the client. Have you seen this behavior before, and do you know what sort of malware might cause it? I would love to advise my customer that she has an issue; I'd also like to stop getting these 404 reports. (Reposted from main Stack Overflow.)

  • Google Analytics shows zero in the "Search Engine Optimization" graph

    - by Saeed Neamati
    In the new Google Analytics design, there is an area for the queries and impressions related to your site. You can get there by following Traffic Sources > Search Engine Optimization > Queries. However, it now shows zero in the "Site Usage" graph at the top, while other areas of Google Analytics definitely show that the site has visitors and is being used. No matter how much I search, I can't find the source of the problem. Does anyone know where the problem might be?
