Search Results

  • GZipped Images - Is It Worth It?

    - by charlie
    Most image formats are already compressed. But in fact, if I take an image and compress it by gzipping it, and then compare the compressed file to the original, there is a difference in size, even if not a dramatic one. The question is: is it worth gzipping images? The content size sent down to the client's browser will be smaller, but there will be some client-side overhead in decompressing it. Please advise.
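
    A quick way to settle this is to measure it. Here is a minimal PHP sketch that compares an image's size before and after gzip; the input file name photo.jpg is a placeholder:

        <?php
        // Compare an image's size before and after gzip compression.
        $original   = file_get_contents('photo.jpg');  // placeholder input file
        $compressed = gzencode($original, 9);          // maximum compression level

        printf(
            "original: %d bytes, gzipped: %d bytes (%.1f%% saved)\n",
            strlen($original),
            strlen($compressed),
            100 * (1 - strlen($compressed) / strlen($original))
        );

    For typical JPEG or PNG files the saving is a few percent at best, which is why server gzip configurations usually exclude image MIME types.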

  • Why did I lose my page rank after a 301 redirect?

    - by rajesh.magar
    As we all know, Google treats subdomains as completely separate domains, so we have to fight for both to get ranked in search results. One of my client's websites had example.com and blog.example.com. To keep all the content in one place, we redirected blog.example.com to example.com/blog/. But in the process we lost our PageRank, and we are still wondering where we went wrong, or whether it just needs more time to show up. What is the reason behind this?
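
    For reference, a page-to-page 301 from the subdomain to the subfolder would look something like this in the subdomain's .htaccess; this is a sketch assuming Apache with mod_rewrite, with the domain names taken from the question:

        RewriteEngine On
        # Permanently redirect every path on blog.example.com to the same
        # path under example.com/blog/.
        RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://example.com/blog/$1 [R=301,L]

    Redirecting every old URL to its own new URL, as above, rather than sending everything to the /blog/ homepage, matters: blanket redirects to a single page are a common cause of lost rankings after a move.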

  • Dynamically change page content based on URL parameter?

    - by volume one
    The title of my question seems simple, but here is an example of what I want to do: http://www.mayoclinic.com/health/infant-jaundice/DS00107 On that page, whenever you click a link to go to a section (e.g. "Symptoms") of the article on infant jaundice, the URL gains a parameter like this: http://www.mayoclinic.com/health/infant-jaundice/DS00107/DSECTION=symptoms As the DSECTION parameter changes, you get different content on the same page DS00107. The content changes, as do the <meta> keywords. Can someone please tell me how this is achieved? I was thinking it was an if/else situation programmed into the page itself to display different content depending on the URL parameter. Any help or suggestions are very much appreciated, and my thanks to you for reading my question.
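
    It can indeed be done server-side by branching on the parameter. A minimal PHP sketch; the section names and placeholder text here are invented for illustration:

        <?php
        // Pick the content block based on the DSECTION URL parameter.
        $sections = array(
            'definition' => array('keywords' => 'infant jaundice',
                                  'body'     => 'Overview text...'),
            'symptoms'   => array('keywords' => 'infant jaundice symptoms',
                                  'body'     => 'Symptoms text...'),
        );

        $section = isset($_GET['DSECTION']) ? $_GET['DSECTION'] : 'definition';
        if (!isset($sections[$section])) {
            $section = 'definition';  // fall back for unknown values
        }
        ?>
        <meta name="keywords"
              content="<?php echo htmlspecialchars($sections[$section]['keywords']); ?>">
        <p><?php echo $sections[$section]['body']; ?></p>

    Note that the Mayo Clinic URL carries DSECTION=symptoms in the path rather than in a ?query string, which suggests a rewrite rule maps that path segment onto an ordinary parameter like the one above.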

  • How to manually list a set of URLs for search engines to index

    - by MarutiB
    I have created a video website that has thousands of videos, with thousands more added daily. Here is my problem: the site loads a skeleton in HTML and fills in all the content through JavaScript and Ajax. As a result, search engines aren't going anywhere except the home page. Is there a way, say in robots.txt, to point to a single HTML page that links to all these videos? I accept that my site is not accessible to a non-JavaScript user, but stats show that this ratio is very low (0.2%). Is there a way I can keep the completely Ajax-driven website and still get each individual video listed on Google?
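
    robots.txt cannot point crawlers at an HTML link page, but it can point them at an XML sitemap, which serves the same purpose. A sketch, with the domain and video URLs as placeholders:

        # robots.txt
        Sitemap: http://www.example.com/sitemap.xml

        <!-- sitemap.xml -->
        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>http://www.example.com/videos/1</loc></url>
          <url><loc>http://www.example.com/videos/2</loc></url>
        </urlset>

    Since videos are added daily, the sitemap would normally be regenerated by a script rather than maintained by hand.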

  • Form development optimization

    - by Juan
    Like many web developers, I build forms all the time, and I find myself doing the same work over and over: placing input fields, assigning a name to each, wiring the form up with Ajax, then writing the PHP, which involves assigning a PHP variable to each $_REQUEST['var'], escaping and validating the data, building the HTML, and emailing the results. About 70% of the work is duplicated, yet I can't simply copy a page and change the fields; I end up wasting more time reformatting, deleting, and adding fields than I would building from scratch. I started planning a "list of IDs to HTML+PHP" converter: I'd input all the IDs and it would output the basic HTML and PHP. Then I thought: thousands of developers must go through this, so I'd be reinventing the wheel. So this is my question: I'm trying to find the wheel somebody must already have invented. I found http://www.trirand.com/blog/jqform/ which does more or less what I'm looking for, but it's an expensive solution with far more functionality than I'd use. Which tools do you use to optimize repetitive HTML and PHP tasks?
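
    The converter described above is straightforward to sketch in PHP. A minimal version, with the field names invented for illustration:

        <?php
        // Generate form HTML and collect submitted values from one field list.
        $fields = array('name', 'email', 'message');  // the only per-form change

        if ($_SERVER['REQUEST_METHOD'] === 'POST') {
            $values = array();
            foreach ($fields as $field) {
                $raw = isset($_POST[$field]) ? $_POST[$field] : '';
                // Escape every value uniformly instead of hand-writing it per field.
                $values[$field] = htmlspecialchars(trim($raw));
            }
            // ...validate $values and email the results here...
        } else {
            echo '<form method="post">';
            foreach ($fields as $field) {
                printf('<label>%1$s <input type="text" name="%1$s"></label><br>', $field);
            }
            echo '<input type="submit"></form>';
        }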

  • Why Can't Computers Outside My Network See the Site? [migrated]

    - by nmagerko
    I have just set up Apache, PHP, MySQL, etc. on my Ubuntu machine, and I was wondering why computers that are not on my network cannot see the basic index.html that Apache serves by default. I set up a static IP address for my computer, and machines on the network use 192.168.1.100 to view the simple site. Is there something I am missing that would allow others to access my site? (It is REALLY simple; no graphics, CSS, etc.)

  • Using Drupal to build a directory listing? [on hold]

    - by Jim
    I am trying to create a form inside Drupal that will let me build a directory similar to this diagram: http://i.imgur.com/EtChBbG.jpg I looked into HTML tables, but they are too basic for what I'm trying to do. How do I create a directory that archives data in alphabetical order? It also has to be able to sort by letter and by other categories. Does anyone have an idea of how I should go about this? Thanks!

  • Can I redirect HTTP requests for an old folder to the homepage using the .htaccess file?

    - by AndreaNobili
    I have the following situation: I had an old blog built with Joomla (it was indexed well enough by search engines). Because of some problems I deleted it and recreated it using WordPress. Now I get many visits (from Google) that lead to specific pages of the old site, pages that don't exist in the new version. For example, I get visits to URLs such as /scorejava/index.php/corso-spring-mvc/1-test that don't exist on my new site. I would like to know whether, using the .htaccess file (or some other mechanism), I can redirect HTTP requests for subfolders that don't exist in the new version to the homepage of my new site. For example, I would like a regular expression that says: all requests to the subfolder corso-spring-mvc (and all its files and subfolders) are redirected to www.scorejava.com. Is that possible?
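
    A single mod_alias rule can cover the whole subtree. A sketch based on the paths in the question, assuming Apache with mod_alias enabled:

        # Send every request under the old Joomla path to the new homepage.
        RedirectMatch 301 ^/scorejava/index\.php/corso-spring-mvc/.* http://www.scorejava.com/

    A 301 (permanent) status also tells search engines to drop the old URLs in favor of the new one.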

  • What is the best structure for an SEO-friendly URL?

    - by Aajahid
    I'm working on converting a website's URLs to SEO-friendly ones. I plan to use this: mysite.com/category-name/pageid-123-page-name I looked at some similarly categorized, highly ranked websites. They have the same structure, except for one thing: in one case the URL format was thissite.com/category-name/pageid-123-page-name.html and in another it was thatsite.com/category-name/pageid-123-page-name.php Now, I know the text in URLs helps with SEO. Is it also helpful to have a file extension? If yes, which one is better? Or, if my current plan is okay, would it be better with a / at the end?
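
    Whichever format is chosen, the friendly URL is usually mapped onto the real page by a rewrite rule. A sketch assuming Apache with mod_rewrite; the script name page.php is invented:

        RewriteEngine On
        # Map /category-name/pageid-123-page-name to the script that renders page 123.
        RewriteRule ^([a-z0-9-]+)/pageid-([0-9]+)-([a-z0-9-]+)/?$ page.php?id=$2 [L,QSA]

    The /? at the end of the pattern accepts the URL with or without a trailing slash.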

  • How to push through a domain transfer in spite of the 60 day rule

    - by corsiKa
    I recently purchased a domain through a registrar which I won't name here. Within the first five minutes of logging in, I found a severe vulnerability that gave me access to the registration details of all users. Simply put, I do not trust this registrar with any kind of business. But I'm unable to transfer the domain because, for some reason, it has to stay at its current registrar for 60 days. We're planning to launch the site this weekend; we can't wait 60 days. And I cannot trust this registrar: if I found such a severe vulnerability in the first few minutes, how many more will I find in those 60 days? Is there a higher authority to whom I can submit a case to get my domain transferred to a different registrar?

  • More than 5 custom variables across multiple websites using Google Analytics

    - by brakes
    We have multiple websites using the same Google Analytics account number so we can track visitors across them. One of these websites has already set 5 custom variables. We want to introduce a new custom variable to track logged-in users of our single sign-on (SSO) system, to find out which parts of which website they are accessing. Is this possible, or have all the custom variables been used up by one of the sites?
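
    For reference, classic Google Analytics (ga.js) sets custom variables per slot on each page, and the five slots are set per hit and stored per visitor in each site's own cookie, so they are not a shared pool drawn down across the whole account. A sketch, assuming the standard ga.js tracking snippet is already on the page; the slot number, name, and value are chosen for illustration:

        // Slot (1-5), name, value, scope (1 = visitor, 2 = session, 3 = page).
        _gaq.push(['_setCustomVar', 5, 'LoggedIn', 'yes', 1]);
        _gaq.push(['_trackPageview']);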

  • How to redirect an international domain to a subfolder on the English site without hurting Google rankings?

    - by ernest1a
    I have two sites: www.example.de and www.main.com. www.main.com is the English version of www.example.de, which is in German. I want to keep only www.main.com: English stays at www.main.com, and the German content moves to www.main.com/de. I am wondering what the best solution is for the old www.example.de:
    - Redirect everything from www.example.de to www.main.com/de using a 301 redirect?
    - Redirect everything from www.example.de to www.main.com/de/page-url-of-old-site.html, so each link gets its own address? Is that necessary, or will Google work out where each page belongs on the new site even if I redirect everything to the home page?
    - Any other solution, maybe just setting the new domain in Google Webmaster Tools or something like that?
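
    The page-to-page option (the second one above) can be done with one pattern rather than one rule per page. A sketch for the old domain's .htaccess, assuming Apache with mod_rewrite and using the domains from the question:

        RewriteEngine On
        # Redirect every page on example.de to the same path under /de/
        # on the new domain.
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.de$ [NC]
        RewriteRule ^(.*)$ http://www.main.com/de/$1 [R=301,L]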

  • How do you enhance your website's speed without compromising the design and access?

    - by Thorn007
    How do you improve your website's load speed without killing the design and accessibility? File compression, a CDN, gzip? What are the best tools for doing so? For example, Google has optimized its site without compromising the design, whereas many websites ruin their images with heavy compression. Is there a way, or a best practice, to increase speed without compromising design and accessibility? Note: sorry for being vague, but I don't know how else to phrase this question.
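
    Gzip, at least, costs nothing visually: it compresses text assets in transit and leaves images alone. A sketch of enabling it in Apache, assuming mod_deflate is available:

        # Compress text-based assets on the fly; images are already compressed.
        AddOutputFilterByType DEFLATE text/html text/css application/javascript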

  • How to Keep SEO Score from Dropping with Duplicate Content

    - by joeh0717
    I'm hoping that someone has a solution for what I'm trying to accomplish. I'm working on a travel agency website with an "Overview" section for each cruise line, located on each cruise line's index page. Here's my issue: the company is building a search engine that includes details on each cruise line. Their write-ups are great, so I'd like to reuse the overview they created for each cruise line rather than writing new ones. However, I don't want the duplicated content to hurt the SEO of the pages it originally appeared on, and it will be duplicated, since each page dynamically generated by the search engine will include a section about the cruise line (where I'd place the overview). Question: is there any way to include these overviews (ideally copying the exact HTML already in place) without the search engines indexing those particular sections? I'd want the rest of each search result page indexed, just not the duplicated section. I saw something about a span class named robots-nocontent for Yahoo (not sure if this also applies to Bing) and googleon/googleoff tags for Google. Is this the best solution? I'm open to any suggestions, thanks!
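
    For reference, the two mechanisms mentioned look like this in markup. A sketch; note that googleon/googleoff is honored by the Google Search Appliance rather than Google's public web crawler, so it should be tested before being relied on:

        <!-- Yahoo: exclude this block from indexing -->
        <div class="robots-nocontent">
          ...cruise line overview copied from the index page...
        </div>

        <!--googleoff: index-->
          ...cruise line overview copied from the index page...
        <!--googleon: index-->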

  • Why does the homepage rank in exact match and not in broad anymore? [closed]

    - by Claude Pelanne
    My website was ranking on page one for "walking shoes reviews" in both broad and exact match. It has disappeared from the broad-match rankings yet still ranks in exact match. There is no indication in Webmaster Tools of anything wrong, yet Google Analytics shows no more activity. This occurred sometime after Panda and Penguin; I'm not sure which, because I didn't check closely for a while. This is a fun site for me: I walk for the enjoyment and health benefits, so I built this site.

  • How can I make a web browser view my .h file as text?

    - by drewbenn
    I want to post a .h file from a project I'm working on. I set up a simple link to it, like: <p>Click here to download the <a href="project_strings.h">strings file</a>. When I click on it, though, my web browser (Iceweasel 12) prompts me to download the file instead of just displaying it. Is there any magic I can add to the web page, or as a header to the file (that will still allow it to be included by a .c file compiled with gcc), to get the .h file displayed in the browser?
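
    Serving the file with a text MIME type makes browsers display it. A sketch for an Apache .htaccess in the same directory, assuming the server's AllowOverride settings permit it:

        # Tell browsers that .h files are plain text, so they render
        # instead of triggering a download.
        AddType text/plain .h

    The file itself is untouched, so it still compiles with gcc.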

  • Show events AND pageviews in Google Analytics

    - by supertrue
    Each page on my site contains a file, and I have Google Analytics set up to track file download events. I would like to see what fraction of users who visit Page X download Page X's file. I can view the number of events by page by clicking on Content » Events » Pages, but I can't figure out how to see both events and pageviews (or visits) at the same time. Visits and pageviews are not available in the Secondary dimension dropdown of the Events list, and Events are not available as a Secondary dimension in the regular traffic listing (Content » Site Content » All Pages). I want something like this:

        Page                        Pageviews   Events
        1. /section/mypage              1,000      123
        2. /category/anotherpage          867       41
        3. /about/download                 88        7

    Is there a way in Google Analytics to view events and pageviews, by page, at the same time?

  • Drupal node access for anonymous users

    - by MrDresden
    I've never used Drupal before, so this may be something easily remedied, which would be awesome. My problem is that a block containing node information can't be viewed by anonymous users (unregistered / not logged in); it gives a "You are not authorized to access this content." message, but shows up fine for logged-in users. The nodes in the block are events, so the block shows events for the next week. I've checked the user access settings but can't find anything that fixes this. I'm using Drupal core 6.26, Event 6.x-2.x-dev, and Event views 6.x-2.4. If anyone has any information or solutions, I'd greatly appreciate it.

  • I want to create an e-learning website [closed]

    - by Viswa
    I want to create an e-learning website and host it. (Maybe after some time I will want to add forms.) These are the things I know: Java, JSP, servlets, HTML (not a guru; almost a beginner). I don't have experience creating websites; I did my college project using JSP, servlets, and JDBC. What are the things or technologies I need to know before creating a website? Is it possible for one person to create a website?

  • CSS not loading when site is viewed via Windows VPN

    - by Dreamling
    An internal site has recently been redesigned, but IE8 does not seem to load the new CSS rules, and only when the site is viewed via VPN. I really have no clue what to look for; I can't reproduce the problem, but it has apparently been affecting the client for the last month. I've suggested:
    - reloading IE8
    - checking internet permissions
    - flushing the cache
    I'm not really certain in what direction to search for the answer. Is it likely to be a server permissions issue? A VPN connection issue? A rare IE8 CSS bug?

  • How do I get the root index page to redirect to a subdirectory without affecting SEO?

    - by paradroid
    I am reviving and reorganising my personal WordPress blog. It uses a URL that looks like this: http://mydomain.com/blog The webserver 301-redirects www.mydomain.com to mydomain.com. I want to use the blog subdirectory because I plan to add other parts to the site, with the blog being only one of them. At the moment, however, there is nothing there but the blog, so I want the root index page to redirect to the blog for the time being. I have been using this on the root index.html page to do the redirect... <meta http-equiv="refresh" content="0;url=./blog"> ...but this seems to have stopped the site being indexed by Google and Bing. How do I do this without affecting SEO? Also, what URL should I put in sitemap.xml?
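
    A server-side redirect is generally safer for indexing than a meta refresh. A sketch for the site root's .htaccess, assuming Apache with mod_rewrite; 302 rather than 301 because the arrangement is temporary:

        RewriteEngine On
        # Temporarily send requests for the bare root to the blog.
        RewriteRule ^$ /blog/ [R=302,L]

    For the sitemap, the usual choice is to list the /blog/... URLs, since those are the pages that actually serve content.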

  • Joining and compressing all JavaScript files together - good idea?

    - by Tomáš Zato
    Currently, I avoid loading any unnecessary scripts on individual pages of my site. I have a class that remembers all the JavaScript files requested during PHP processing and adds them to the HTML. I was just thinking that I could merge the current set of files, save the result in a special directory, and let the browser download just one big file. Since the number of possible combinations is not very high, I would end up with about 10 combined files for different pages. I've never seen that on any site. What are the reasons not to do it? I need very fast page loads.
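
    A sketch of the combine-and-cache step in PHP; the cache directory name is invented, and the file list is assumed to come from the class described above:

        <?php
        // Merge the page's script files into one cached bundle, rebuilt only
        // when the set of files changes.
        function combineScripts(array $files, $cacheDir = 'js-cache')
        {
            if (!is_dir($cacheDir)) {
                mkdir($cacheDir);
            }
            $bundle = $cacheDir . '/' . md5(implode('|', $files)) . '.js';
            if (!file_exists($bundle)) {
                $out = '';
                foreach ($files as $file) {
                    $out .= file_get_contents($file) . ";\n";  // guard against missing semicolons
                }
                file_put_contents($bundle, $out);
            }
            return $bundle;  // emit as a single <script src="..."> tag
        }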

  • A relatively new blog seems to be getting very poor Google indexing

    - by Genadinik
    I have a new blog that is 2 months old. In the first few weeks it was indexed nicely, and my Google Webmaster Tools reports showed that it was being crawled and beginning to rank for some terms. Then, as I kept writing, the Webmaster Tools report thinned out, showing fewer and fewer terms that this blog ranks for. Now there are only 4 terms, one of them being my name. Is there something I need to do to keep the old posts indexed and crawled? Thanks, Alex

  • Spam problem through cPanel

    - by mrtunes
    On a new website, I publicly display an email address, [email protected]. I then set up an email forwarder in my Hostmonster cPanel, so that if the public address ever becomes spam-ridden we can simply cut off the forwarding. However, the client received a spam message that looked like the following. To: the client's personal email address (not the public address) Subject: domain.com opportunities Body: marketing junk The problem is that the "To" should have said [email protected]. I am now worried that the real email address was harvested from the Hostmonster backend.

  • What effect does using itemprop="significantLinks" on anchors have for SEO and what is the ideal use?

    - by hdavis84
    I'm practicing the application of microdata via http://schema.org. Anyone who has browsed the documentation there knows it could be much clearer about the intended use of each property. My question here is about the "significantLinks" property and how it affects SEO for on-page, in-content anchor text. Does anyone have more information on whether it is good to use for link optimization? I understand that schema.org means it to be used on "non-navigational links" and that those links should be relevant to the current page's meaning. But will using this property hurt SEO, or make SEO better for each page?
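
    For reference, the property is applied to in-content anchors on a page marked up as a schema.org WebPage; note that schema.org lists the plural significantLinks as superseded by the singular significantLink. A sketch, with the URL and link text invented:

        <body itemscope itemtype="http://schema.org/WebPage">
          <p>
            Our <a itemprop="significantLink" href="/pricing">pricing page</a>
            explains the plans in detail.
          </p>
        </body>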
