Search Results

Search found 5380 results on 216 pages for 'webmasters'.


  • PHP framework suggestions

    - by user1104791
    I'm going to be creating a site for a business with the following:

    - Single item purchases with PayPal, Google Checkout and Amazon Checkout
    - Digital downloads for members from a CDN
    - Forum
    - Voting on different items

    While that seems rather simple, having to deal with three different payment gateways is a big pain. Django has a great payment library which works for all three, but I'm not able to find one for any of the PHP frameworks. Anyone have any suggestions? I'm posting here as suggested by someone at Stack Overflow.

    Read the article

  • What is happening to a domain with status PENDING DELETE as well as AUTORENEWPERIOD?

    - by Alex Angas
    A domain I once registered but gave away:

        Expiration Date: 13-Mar-2013 14:39:45 UTC
        Sponsoring Registrar: Directi Internet Solutions Pvt. Ltd. dba PublicDomainRegistry.com (R159-LRMS)
        Status: PENDING DELETE RESTORABLE
        Status: HOLD
        Status: AUTORENEWPERIOD
        Status: REDEMPTIONPERIOD
        Registrant ID: DI_7838158
        Registrant Name: scherhag
        Registrant Organization: Locafroid europe sa

    I'd like to get this domain back, and I'm hoping that PENDING DELETE means it might soon be released to the market. However, I'm not sure, as there is also a status that says AUTORENEWPERIOD.

    Read the article

  • RewriteRule not working for certain URLs

    - by keiki
    There are a few domains pointing to the same server, and of course I need them all redirected to only one of them. The redirects work, but only for certain URLs.

    What works: http://www.domain.com, http://domain.com, domain.com/index.html, domain.com/index.php, domain.com/nonExistentDirectory, and, if I click them in the menu, domain.com/foo/bar and domain.com/foo/bar.html (or .php or another extension) are also redirected correctly.

    What doesn't work: domain.com/existentDirectory, and domain.com/foo/bar if I type the URL in the address bar.

    If anyone has the time, skill and will to tell me where the mistake is, I'll be deeply grateful. Here's my .htaccess file:

        AddHandler x-httpd-php .html .htm

        <ifModule mod_gzip.c>
        mod_gzip_on Yes
        mod_gzip_dechunk Yes
        mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
        mod_gzip_item_include handler ^cgi-script$
        mod_gzip_item_include mime ^text/.*
        mod_gzip_item_include mime ^application/x-javascript.*
        mod_gzip_item_exclude mime ^image/.*
        mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
        </ifModule>

        <ifModule mod_expires.c>
        ExpiresActive On
        ExpiresDefault "access plus 1 seconds"
        ExpiresByType text/html "access plus 1 seconds"
        ExpiresByType image/gif "access plus 2592000 seconds"
        ExpiresByType image/jpeg "access plus 2592000 seconds"
        ExpiresByType image/png "access plus 2592000 seconds"
        ExpiresByType text/css "access plus 2592000 seconds"
        ExpiresByType text/javascript "access plus 216000 seconds"
        ExpiresByType application/x-javascript "access plus 216000 seconds"
        </ifModule>

        <ifModule mod_headers.c>
        <filesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)$">
        Header set Cache-Control "max-age=2592000, public"
        </filesMatch>
        <filesMatch "\.(css)$">
        Header set Cache-Control "max-age=2592000, public"
        </filesMatch>
        <filesMatch "\.(js)$">
        Header set Cache-Control "max-age=216000, private"
        </filesMatch>
        <filesMatch "\.(xml|txt)$">
        Header set Cache-Control "max-age=216000, public, must-revalidate"
        </filesMatch>
        <filesMatch "\.(html|htm|php)$">
        Header set Cache-Control "max-age=1, private, must-revalidate"
        </filesMatch>
        </ifModule>

        <ifModule mod_headers.c>
        Header unset ETag
        </ifModule>
        FileETag None

        <ifModule mod_headers.c>
        Header unset Last-Modified
        </ifModule>

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress

        RewriteCond %{HTTP_HOST} ^foo\.com$ [OR]
        RewriteCond %{HTTP_HOST} ^www\.foo\.com$
        RewriteRule (.*) http://domain.com/$1 [R=301,L,QSA]

        RewriteCond %{HTTP_HOST} ^foo1\.com$ [OR]
        RewriteCond %{HTTP_HOST} ^www\.foo1\.com$
        RewriteRule (.*) http://domain.com/$1 [R=301,L,QSA]

        RewriteCond %{HTTP_HOST} ^foo2\.com$ [OR]
        RewriteCond %{HTTP_HOST} ^www\.foo2\.com$
        RewriteRule (.*) http://domain.com/$1 [R=301,L,QSA]

        RewriteCond %{HTTP_HOST} ^foo3\.com$ [OR]
        RewriteCond %{HTTP_HOST} ^www\.foo3\.com$
        RewriteRule (.*) http://domain.com/$1 [R=301,L,QSA]

        RewriteCond %{HTTP_HOST} ^foo8\.com$ [OR]
        RewriteCond %{HTTP_HOST} ^www\.foo8\.com$
        RewriteRule (.*) http://domain.com/$1 [R=301,L,QSA]

    Thinking that the above version was overkill, I also tried to redirect all requests for domains other than the main one like this:

        RewriteCond %{HTTP_HOST} !^domain\.com$ [NC]
        RewriteRule ^(.*)$ http://domain.com [L,R=301]

    Is that also wrong? Because it doesn't work either!

    P.S. @Sodved I've tried that and it doesn't help (I comment here because I can't seem to be able to comment on your answer). Removing the following piece of code didn't solve the issue either, so the problem must be somewhere else:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress

    New details: using a redirect-checking tool, I got the following results for the URLs that are not redirected:

        Checked link: http://domain.com/aDirectory/
        Type of link: direct link

    (note the trailing slash above) and:

        Checked link: http://domain.com/aDirectory
        Type of redirect: 301 Moved Permanently
        Redirected to: http://domain.com/aDirectory/

    (no trailing slash in the checked link). I hope/suspect I'm getting closer to the cause of this behavior.
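    Two things stand out, offered as a hedged diagnosis rather than a confirmed fix. First, the host-canonicalization rules sit after WordPress's catch-all, whose [L] flag can stop processing before they run; the usual advice is to put the canonical redirect first. Second, the simplified rule drops the request path because its substitution lacks $1. Also note that the trailing-slash 301 you observed for existing directories is mod_dir's DirectorySlash behavior, which fires before per-directory rewrite rules and preserves whichever host was requested. A minimal sketch (one rule covering all the foo* domains):

        RewriteEngine On
        RewriteBase /

        # Canonicalize the host first, before WordPress's [L] catch-all runs
        RewriteCond %{HTTP_HOST} !^domain\.com$ [NC]
        RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]

        # BEGIN WordPress
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        # END WordPress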

    Read the article

  • Worth changing the URL structure to incorporate keywords?

    - by Dejan Pelzel
    I am migrating my blog from PHP to ASP.NET, and while recoding the whole website I figured I might as well improve the URL structure. This is how a URL looks now:

        example.com/blog/post/755/hakurei-reimu-cosplay-from-touhou-by-kishigami-hana

    and this is how it will look after the change (cosplay being the dynamic main keyword of the post):

        example.com/blog/cosplay/hakurei-reimu-cosplay-from-touhou-by-kishigami-hana-755/

    The website is a bit more than half a year old and receives around 650k page views a month, mainly from search traffic. Of course, everything would be redirected with 301 redirects. Do you think it is worth changing to the new URL structure, or will it harm the ranking in the long run?

    Read the article

  • New host, high load?

    - by dotancohen
    A few minutes ago I signed up with a new web host; I have yet to move my sites over. Upon my initial SSH connection I checked the load and memory usage, and they do seem rather higher than I would like:

        # uptime
         12:06:51 up 71 days, 23:23,  1 user,  load average: 9.02, 9.49, 9.45
        # free
                     total       used       free     shared    buffers     cached
        Mem:      33014800   31927192    1087608          0    2384812   17729816
        -/+ buffers/cache:   11812564   21202236
        Swap:     16787916       8584   16779332

    Is that a bit too packed? I'm only paying about $5 USD per month, so I don't expect <0.1 loads, but ~10 is worrisome. Is it not? Also, there is no /etc/issue file, so I tried other methods to guess the OS:

        # uname -a
        Linux box358.bluehost.com 2.6.32-20120131.55.1.bh6.x86_64 #1 SMP Tue Jan 31 15:43:27 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
        # which yum
        /usr/bin/yum
        # which apt-get
        #

    That looks like CentOS / RHEL 6.2, possibly?
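    A few non-invasive checks that would confirm the guess and put the load numbers in context (a sketch; release files vary by distribution):

        cat /etc/redhat-release            # e.g. "CentOS release 6.2 (Final)" on CentOS/RHEL
        rpm -q centos-release              # this package exists only on CentOS
        grep -c ^processor /proc/cpuinfo   # core count: a load of ~9 on 16+ cores is much
                                           # less alarming than a load of ~9 on 2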

    Read the article

  • What is the recommended minimum object size for gzip benefits?

    - by utt73
    I'm working on improving page display times, and one of the methods is to gzip content from the web server. Google recommends:

        Note that gzipping is only beneficial for larger resources. Due to the overhead and latency of compression and decompression, you should only gzip files above a certain size threshold; we recommend a minimum range between 150 and 1000 bytes. Gzipping files below 150 bytes can actually make them larger.

    We serve our content through Akamai, using their network as a proxy and CDN. What they've told me:

        Following up on your question regarding the minimum size at which Akamai will compress a requested object when sending it to the end user: the minimum size is 860 bytes.

    My reply:

        What is the reason why Akamai's minimum size is 860 bytes? And why, for example, is this not the case for files Akamai serves for Facebook? Google recommends gzipping more aggressively, and that seems appropriate on our site, where the most frequent hits, by far, are AJAX calls that are <860 bytes.

    Akamai's response:

        The reason 860 bytes is the minimum size for compression is twofold: (1) the overhead of compressing an object under 860 bytes outweighs the performance gain; (2) objects under 860 bytes can be transmitted via a single packet anyway, so there isn't a compelling reason to compress them.

    So I'm here for some fact checking. Does the reasoning behind the 860-byte limit really end at packet size? And why would high-traffic sites push this lower, closer to the 150-byte limit: just to save on bandwidth costs, or is there a performance gain in doing so?
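    The same trade-off shows up in ordinary web-server configuration. As an illustration only (nginx shown; the values are hypothetical, not an Akamai setting):

        gzip             on;
        gzip_min_length  860;   # skip responses small enough to fit one packet anyway
        gzip_comp_level  5;
        gzip_types       text/plain text/css application/x-javascript application/json;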

    Read the article

  • Should I add a "nofollow" attribute to download links, or disallow the URLs in robots.txt?

    - by Laurent
    I have a download link very similar to Opera's: it's just a script that sends the file. It doesn't have an extension, and there's no obvious way to tell that it's actually a download link. Since I don't want robots to crawl this link, do I need to add it to robots.txt, or maybe add a "nofollow" attribute to it? I see that on Opera's website they did neither, so perhaps it's not necessary?
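    For reference, the two options look like this (the /download path is hypothetical), and they do different things: robots.txt actually prevents crawling of the URL, while rel="nofollow" only stops that one link from passing PageRank and does not stop the URL being discovered through other links.

        # robots.txt
        User-agent: *
        Disallow: /download

    The link variant would be <a href="/download" rel="nofollow">Download</a>.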

    Read the article

  • What do impressions and CTR mean in Google Webmaster Tools?

    - by KoolKabin
    I am checking Google Webmaster Tools. In the search queries section I found a lot of keywords with their impressions, CTR, etc. When I click on one of the query keywords, it shows the keyword and its position in the search results, but when I go to google.com and type that keyword, my site doesn't show up at all. How do I find my site's impressions on google.com? My site: http://www.trekkingandtoursnepal.com Keyword: trekking nepal

    Read the article

  • How to make Google Analytics report on two domains as though they are one site?

    - by Ben
    We have a main site that Google Analytics is currently running fine on (www.ourcompany.com). We have a page that is technically part of the site (same design, etc.) but is hosted on another server/domain for various business reasons (www.ourparentcompany.com/ourcompanyapp/). Do we just add the normal Google Analytics code to the bottom of that page, or is there something more we have to do? If there isn't anything more, then couldn't anyone just take your GA code and start reporting analytics to your profile from their site?
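    Broadly, yes, the same snippet works, but with classic ga.js you would also set up cross-domain linking so a visit isn't split in two when the user crosses domains. A hedged sketch (placeholder UA ID; both pages carry this block):

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXXX-1']);   // same property on both domains
          _gaq.push(['_setDomainName', 'none']);
          _gaq.push(['_setAllowLinker', true]);
          _gaq.push(['_trackPageview']);
        </script>

        <!-- links that cross between the two domains go through _link: -->
        <a href="http://www.ourparentcompany.com/ourcompanyapp/"
           onclick="_gaq.push(['_link', this.href]); return false;">Our app</a>

    As for the last worry: nothing technically stops a third party from sending data to your profile, which is why filtering reports by hostname is common practice.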

    Read the article

  • django & postgres linux hosting (with SSH access) recommendations

    - by Justin Grant
    We're looking for a good place to host our custom Django app (a fork of OSQA) and its PostgreSQL backend. Requirements include:

    - Linux
    - Python 2.6 or (ideally) Python 2.7
    - Django 1.2
    - Postgres 8.4 or later
    - DB backup/restore handled by the hoster, not us
    - OS & dev-platform-stack patching/maintenance handled by the hoster, not us
    - SSH access (so we can pull source code from GitHub, install Python eggs, etc.)
    - ability to set up cron jobs (e.g. to send out daily email updates)
    - ability to send up to 10K emails/day
    - good performance (not ganged up with a zillion other sites on one CPU, not starved for RAM)
    - FTP or SCP access to web logs
    - dedicated public IP
    - SSL support
    - costs under $1000/month for a relatively small site (<5M pageviews/month)
    - good customer service

    We already have a prototype site running on EC2 on top of a Bitnami DjangoStack. The problem is that we have to patch the OS, patch Postgres, etc. We'd really prefer a platform-as-a-service (PaaS) offering, like Heroku offers for Rails apps, where all we need to worry about is deploying our code instead of system software patching and maintenance. Google App Engine is closest to what we're looking for, but they don't offer relational DB access (not yet, at least). Anyone have a recommendation?

    Read the article

  • AdSense page impressions are showing zero

    - by user2687
    Hi, I have created a new AdSense account for my blog [link removed as it is loaded with ads, popups and who knows what kind of XSS] and integrated the ads with my blog, but the page impressions are showing zero. Analytics reports 300 to 400 visitors/day, yet impressions remain at zero. I have verified my pub-id and don't know what the problem is. How do I contact the Google AdSense team about this? Please help me out. -Thanks

    Read the article

  • Do CDNs work with POST operations?

    - by iddqd
    I'm using a CDN (Level3) for the first time and I'm a bit confused. I'm accessing dynamic URLs such as http://cdn.mysite.com?getItem=1234 that return text data. Do CDNs work with HTTP POST operations? When I issue an HTTP POST, my "real" server receives the request every time, so I'm wondering if the CDN has a problem with POST operations. With HTTP GET it seems to work: I call the URL once (from my application) and can see my server receiving the request; if I call it a second time, the CDN delivers it directly and my server doesn't get anything. However, if I open the same link manually in a second browser tab, my server is asked to deliver it again; shouldn't it be cached by now? Many thanks.
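    CDNs generally treat POST as uncacheable and forward it to the origin every time, so that part is expected. The second-tab behavior usually comes down to the cache key or freshness: different request headers (cookies, Accept-Encoding), a different edge node, or an already-expired TTL. A quick hedged check (header names vary by CDN):

        curl -sI 'http://cdn.mysite.com?getItem=1234'
        # Inspect Cache-Control/Expires from the origin plus any Age/Via headers;
        # run it twice to see whether the object is actually being cached.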

    Read the article

  • Server overhead caused by bots?

    - by giuseppe
    I have one customer website causing overhead (http://www.modacalcio.it/en/by-kind/football-boots.html). With htop open, I tried navigating the website, and most of the load comes from the AJAX links placed on the left side of the site. The website is hosted on a VPS with 3 processors and 2GB RAM, with plenty of disk space. The real problem is that this website is new and not visited much. From the http-status module I can see that the overhead is caused by bots (Google bots, Bing bots, href checkers and so on). So I thought it's probably those spiders trying to crawl all those links at once; could that be causing this overhead? I have also put rel="nofollow" on those links, but it doesn't keep the bots away. Is there any way, through code or Plesk, to hide those links from the bots?
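    As background, rel="nofollow" is a hint about PageRank, not a crawl directive, so it won't keep bots off those URLs. A robots.txt sketch (the Disallow pattern is hypothetical and must match your AJAX filter URLs); note that Googlebot ignores Crawl-delay, and its rate is set in Google Webmaster Tools instead:

        User-agent: *
        Crawl-delay: 10
        Disallow: /en/by-kind/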

    Read the article

  • Infrastructure to effectively set up experiments and learn from them

    - by David
    Open-org.com is in the early stages of creating our first product: a place on the web where one can ask lawyers questions at a fraction of their normal cost. An early stage front page can be found here. I got inspired by this video, recommended by Jeff Atwood, which talks about getting feedback faster; that is the reason for this question.

    The problem: needless to say, we want our conversion rates to be as high as possible. We therefore want to be able to rapidly set up a new experiment where we change something on the site (like moving an image slightly, rewriting a sentence, etc.), present the modified page to a random subset of users, and then compare the experiment's conversion rate with another version's. I could well imagine running 10-100 experiments simultaneously, and it would be nice to have a feature where experiments with obviously worse results are ended ahead of schedule.

    My question: does infrastructure to support the whole process exist? A short description of ours: we use EC2 and PHP and have a script to automatically start up new instances with all the needed software. Still, starting a new server for every experiment seems like overkill, so I am wondering what other options exist.

    Btw, if you feel like working for Open-org.com, you can pick a task and start working, or suggest a new task. All profits are given out to the contributors.
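    Hosted tools existed for exactly this (Google Website Optimizer, for one), but the core mechanism is small enough to sketch. A hedged client-side version (all names hypothetical): bucket each visitor once, persist the choice in a cookie, and log the assignment so conversions can be segmented per variant:

        function getVariant(name, count) {
          var m = document.cookie.match(new RegExp(name + '=(\\d+)'));
          var idx = m ? parseInt(m[1], 10) : Math.floor(Math.random() * count);
          document.cookie = name + '=' + idx + '; path=/; max-age=2592000'; // 30 days
          return idx;
        }

        // Usage: two headline variants; render the chosen one, then report the
        // assignment (e.g. as an analytics event) to compare conversion rates.
        var headline = ['Ask a lawyer', 'Get legal answers fast'][getVariant('exp_headline', 2)];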

    Read the article

  • Should I get an SGC enabled SSL certificate?

    - by Simon
    I'm in the market for a new SSL certificate and am wondering if I should get an SGC-enabled certificate or not. In the past I have used cheap SSL certificates, but since this is for a new company website I want to make sure I have the best, and I am unsure whether it is worth paying the extra. The documentation states that it just enables older browsers to use 128-bit encryption when they would normally only be able to use 40- or 56-bit encryption. Would you pay the extra for older browsers, which are likely to be extremely rare?

    Read the article

  • Programmatically removing an exit popup from a page? [closed]

    - by Jose Garcia
    I have a page A which I want to show on page B, so I used an iframe to display page A on B. The problem: page A has an annoying exit popup, and I don't want it on page B. Assuming I can't edit the code of page A, can I add some code to page B to remove the exit popup? Please provide sample code; I'd prefer something that runs on my LAMP shared hosting. I can use something other than an iframe if need be. Thanks.
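    For background: the same-origin policy keeps page B's scripts out of a cross-origin iframe, so page A's popup code can't be edited from outside; you can only constrain the frame. The HTML5 sandbox attribute blocks popups from framed content unless allow-popups is granted; browser support varies, so treat this as a sketch (URL hypothetical):

        <iframe src="http://example.com/page-a"
                sandbox="allow-scripts allow-same-origin allow-forms">
        </iframe>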

    Read the article

  • How to fix bad SEO after being hacked

    - by mkprogramming
    About a year ago my WordPress website was hacked, and some company decided to go nuts and actually do some "SEO" with the various links it created. Some of my pages would show up on Google as "payday cash advance" instead of "portfolio". The issue has been resolved, but now, as I've been doing good SEO, I've noticed (when checking backlinks) that there are tons of links still on the internet (mostly on broken sites now) that point to my website with titles like "get a loan today" and so on. Is there a way to remove these links? Can I tell Google to ignore them? Help!
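    If the spammy sources are identifiable, Google's disavow-links feature (submitted through Webmaster Tools, if it is available to your account) takes a plain text file. A hedged sketch with example entries only:

        # disavow.txt - uploaded via Google Webmaster Tools
        domain:spammy-loans.example
        http://broken-site.example/old-links.html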

    Read the article

  • ASP.NET website deployment [on hold]

    - by Rei Brazilva
    I am getting my hands wet with ASP.NET and have been following the tutorials. I deployed the site to Azure and it worked great. Today I started actually designing the site, and when I published, it looks as if it doesn't read any of the files I just updated, added, or modified. It works on my localhost, but not on Azure. I thought that when you publish, everything goes up, including the new files. (I don't have enough reputation to add a picture, so you'll forgive me.) So, basically, how do I get my entire site uploaded? In case anyone does stop by, I was able to pull this out just recently:

        CA0058 Error Running Code Analysis
        CA0058 : The referenced assembly 'DotNetOpenAuth.AspNet, Version=4.0.0.0, Culture=neutral, PublicKeyToken=2780ccd10d57b246' could not be found. This assembly is required for analysis and was referenced by:
        C:\Users\lotusms\Desktop\LOTUS MARKETING\ASP.NET\WebsiteManager\WebsiteManager\bin\WebsiteManager.dll,
        C:\Users\lotusms\Desktop\LOTUS MARKETING\ASP.NET\WebsiteManager\packages\Microsoft.AspNet.WebPages.OAuth.2.0.20710.0\lib\net40\Microsoft.Web.WebPages.OAuth.dll.
        [Errors and Warnings] (Global)

        CA0001 Error Running Code Analysis
        CA0001 : The following error was encountered while reading module 'Microsoft.Web.WebPages.OAuth': Assembly reference cannot be resolved: DotNetOpenAuth.AspNet, Version=4.0.0.0, Culture=neutral, PublicKeyToken=2780ccd10d57b246.
        [Errors and Warnings] (Global)

    Could this have something to do with the problem?

    Read the article

  • Tag link suggestion plugin for WordPress?

    - by Emerson
    Hi, every time I write a post I make sure I add links to words that I have tags for. For example: "The economy of Brazil has improved in the last few years." This ensures that when people re-post my content, a lot of back-links are created to my tags. It is quite a lot of work to do this manually for every post. It would be great if there were a plugin that suggested tags to apply when they match existing words in the text of the post. Is there such a thing?

    Read the article

  • Group "Discussion" software?

    - by Kayle
    My client wants a "lite" forum... not unlike these stack exchange sites, but even a little lighter. There's a screenshot of the discussion group she likes most, below. You can also go here to see it for yourself it you like. I don't think traditional forum apps will display, functionally, in this manner. Is there any software I can use to get a similar result? A web service would be acceptable as well.

    Read the article

  • I see files in FileZilla, but the internet denies their existence

    - by Zach L.
    I am doing some updates to a 10-year-old site, and I am baffled. Everything worked great locally, so I uploaded a bunch of stuff to the server using FileZilla. Within FileZilla I can see all of the files, but for some reason I get a 404 when trying to view them. It seems as though (at least for the folder I'm currently checking) this is happening for items farther down the list alphabetically. I tried to re-upload a file individually, but it didn't change anything. Is this an indication that I hit some sort of limit with the hosting company? And if so, why can I still see the files in FileZilla? Please offer guidance; I am at a loss.

    Read the article

  • Is there a general rule of thumb for which browsers to optimize your site for?

    - by Christian
    I have a site (recently relaunched with a new design) whose optimization for IE7 I have put off for far too long; I was just never too worried about it. The site is optimized for IE8-10, Firefox, Chrome, Opera, Safari, etc. Then I asked myself: is it even worth it? I checked traffic over the couple of months before the relaunch, and about 1.3% of it came from IE7. So, is there a general cutoff percentage below which you would not optimize for a specific browser?

    Read the article

  • How to fix this specific Google "Fetch as Googlebot" error appearing in my Webmaster Tools?

    - by UXdesigner
    Good day, I'm currently trying to find out why my website has lost all of its rank in Google. I don't even appear in the Google results for my own domain, yet other sites that link to me do appear. I think it's because I left the site alone for two months and came back to 20k comment spam, which I completely deleted and then guarded against with filters and a new Disqus comment service. The thing is, having added my site to Google Webmaster Tools, I'm finding several awful things. For example, when I click Fetch as Googlebot, I receive the error message below in response to my request, and I don't know what the real problem is or how to fix it. I simply don't get it. This is what appears:

        Date: Wednesday, July 20, 2011 9:43:35 AM PDT
        Googlebot Type: Web
        Download Time (in milliseconds): 55

        HTTP/1.1 403 Forbidden
        Date: Wed, 20 Jul 2011 16:43:36 GMT
        Server: Apache
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 248
        Keep-Alive: timeout=2, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        403 Forbidden
        You don't have permission to access / on this server.
        Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

    Do you guys know anything about this problem? I need to have Google crawl my site again; I had a really good Google presence for the past three years, and now there's nothing. Thanks,
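    A 403 served specifically to Googlebot usually comes from access rules, sometimes left over from an anti-spam cleanup. Hedged examples of the kinds of directives worth grepping for in .htaccess or the vhost config (patterns illustrative, not taken from your server):

        # a user-agent block:
        RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
        RewriteRule .* - [F]

        # or an IP deny that happens to cover Google's crawler range (66.249.x.x):
        Order allow,deny
        Allow from all
        Deny from 66.249.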

    Read the article

  • What data to send when tracking clicks with Google Analytics events (and how)?

    - by user359650
    When tracking clicks on links, there are 3 items I'm interested in:

    - link location in the page, grabbed from the id of the closest parent: to see the influence of location on click-through
    - link text: to see the influence of text on click-through
    - link href attribute value: to see where people go when leaving my website

    The problem when using Google Analytics to track those clicks is that events only have 3 available text fields, one of which is the category. If you use the category to store one of the above items, you will create a mess in your Event reporting, because you will have as many categories as item values. Therefore, if you assign a predefined value to the category (e.g. clicks), you're left with only 2 event fields (action, label) to store 3 items (location, text, href). That in itself isn't the end of the world, because you can concatenate 2 items into 1 event field and then use the reporting or the API to filter things out. Accordingly, what I plan on doing is this:

        category: clicks
        action:   {location_on_page} ¦ {text}
        label:    {href}

    where {__} are variable values related to the clicked links. With this I can easily create some reports directly via the GUI:

    - downloads: include only events where the label ends with .pdf
    - click-outs to particular domains: include only events where the label contains the domain

    And for more complex tasks I need to export the data (or use the API), e.g. the influence of location on clicks: for each location in the design, count the number of events that have that location in the action, then corroborate with pageviews of the corresponding pages. Whilst this looks good, I'm wondering if there is a better approach, hence the following questions:

    Q1: Can you foresee any particular issues with this setup (e.g. things I won't be able to report on)?
    Q2: Can you think of other data that would be interesting to include in the event?
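    For what it's worth, the plan maps directly onto the classic ga.js event call. A hedged sketch (the parent-id lookup is simplified; a real version would walk up the tree until it finds an element with an id):

        // wire-up: <a href="http://example.com/" onclick="trackClick(this)">...</a>
        function trackClick(link) {
          var loc = link.parentNode.id || 'unknown';   // "closest parent id", simplified
          _gaq.push(['_trackEvent', 'clicks', loc + ' ¦ ' + link.innerHTML, link.href]);
          // _trackEvent(category, action, label) = clicks, {location} ¦ {text}, {href}
        }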

    Read the article

  • TinyMCE autoresize plugin not working

    - by user31929
    I want to reproduce this simple behaviour: http://tinymcesupport.com/tutorials/autoresize-automatic-resize-plugin

    This is my init:

        <!-- TinyMCE -->
        <script type="text/javascript" src="js/jscripts/tiny_mce/tiny_mce.js"></script>
        <script type="text/javascript">
        tinyMCE.init({
            mode : "exact",
            elements : "pagina_testo_colonna1,pagina_testo_colonna2,pagina_testo_colonna3",
            theme : "advanced",
            plugins : "paste,autoresize",
            plugin_preview_width : "100%",
            width : "100%",
            theme_advanced_buttons1 : "pastetext,|,bold,italic,underline,strikethrough,|,bullist,numlist,|,indent,outdent,|,undo,redo,|,justifyleft,justifycenter,justifyright,justifyfull,|,link,unlink,|,charmap",
            theme_advanced_buttons2 : "",
            theme_advanced_buttons3 : "",
            theme_advanced_disable : "image,anchor,cleanup,help,code,hr,removeformat,sub,sup",
            theme_advanced_resizing : true,
            paste_text_use_dialog : true,
            relative_urls : false,
            remove_script_host : false
        });
        </script>
        <!-- /TinyMCE -->

    I have added "autoresize" to the plugins list, but my editors do not resize while I'm writing; they simply scroll. I have multiple editors on the same page. What's wrong with my code?
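    One interaction worth testing, offered as a guess rather than a confirmed fix: the autoresize plugin manages the editor height itself and tends to fight with the theme's own resize handling, so disabling theme_advanced_resizing and (if your 3.x release supports them) setting the plugin's min/max options may help:

        tinyMCE.init({
            mode : "exact",
            elements : "pagina_testo_colonna1,pagina_testo_colonna2,pagina_testo_colonna3",
            theme : "advanced",
            plugins : "paste,autoresize",
            width : "100%",
            theme_advanced_resizing : false,  // let the plugin, not the theme, manage height
            autoresize_min_height : 150,      // autoresize plugin option
            autoresize_max_height : 800
        });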

    Read the article
