Search Results

Search found 9728 results on 390 pages for 'zee pro'.


  • How to display/add a play button on a YouTube image?

    - by Zakir Sajib
    I have embedded a YouTube thumbnail from the YouTube server ("http://img.youtube.com/vi/0.jpg"), but no play button shows up in the middle of the image the way it usually does on YouTube videos. I tried the following code to get an image on top of the YouTube thumbnail, but it shows both images at full size (I know why):

        <a class="fancybox" href="#video">
          <img src="/wp-content/themes/mytheme/images/play_button.png" width="180" height="150"
               style="background: url(http://img.youtube.com/vi/<?php echo $youtubeid; ?>/0.jpg) no-repeat transparent" />
        </a>
        <div id="video">
          <!-- here is my usual embedded YouTube code -->
        </div>

    Both images are shown at 180 x 150, but that's not what I want. I want the YouTube thumbnail shown at 180 x 150 and the play button image (play_button.png) displayed small, in the middle of the thumbnail. Any clue in CSS or PHP coding would be a great favour.
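
    One common approach, sketched below (this is not from the original post; the wrapper class names and the 48px button size are assumptions): give the thumbnail a relatively positioned wrapper and absolutely position a small play button over its centre.

        <style>
          /* Hypothetical sketch: overlay a small play button on the thumbnail */
          .thumb-wrap { position: relative; width: 180px; height: 150px; display: block; }
          .thumb-wrap .thumb { width: 180px; height: 150px; display: block; }
          .thumb-wrap .play {
            position: absolute;
            top: 50%; left: 50%;
            width: 48px; height: 48px;     /* keep the button small */
            margin: -24px 0 0 -24px;       /* centre it over the thumbnail */
          }
        </style>
        <a class="fancybox thumb-wrap" href="#video">
          <img class="thumb" src="http://img.youtube.com/vi/VIDEO_ID/0.jpg" alt="" />
          <img class="play" src="/wp-content/themes/mytheme/images/play_button.png" alt="Play" />
        </a>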

    Read the article

  • CMS for Blog site

    - by Yau Leung
    I would like to create a blog site with features like Engadget: editors can upload blog posts and albums, while users can comment. I know it would be even easier to use Blogspot, but it's blocked in China. I have tried Joomla before; it seems a bit slow even after removing most of the modules, and the memcache plugin doesn't help much either. Is there any other option? Do I need other plugins to run WordPress as a blog?

    Read the article

  • How to make this CSS design of words in headings look clean and well designed? [closed]

    - by kacalapy
    I am trying to put lipstick on the pig, and not wearing my UI developer hat often is making this impossible. Can someone give me nice alternatives to the code below? This is what I have now:

        <style>
          .FirstLetter:first-letter {
            font-family: arial; font-size: 14pt; font-weight: bold;
            color: White; background: Blue; border: 1px black solid;
            padding-top: 8px; padding-left: 8px; padding-bottom: 3px;
          }
          .Spaced { letter-spacing: 5px; font-family: arial; font-size: 14pt; font-weight: bold; }
        </style>
        <div class="FirstLetter Spaced headerFont">
          Executive Summary
        </div>

    The result of the above code is ugly. I am ONLY looking to make the header section look better; that's the part where the first letter is blue.
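
    A minimal alternative sketch (not from the original question; the colour and spacing values are assumptions, the markup follows the poster's): collapse the two classes into one rule with consistent metrics and a softer accent on the first letter.

        <style>
          /* Hypothetical cleanup: one rule, consistent spacing, subtler accent */
          .Heading {
            font-family: Arial, sans-serif;
            font-size: 14pt;
            font-weight: bold;
            letter-spacing: 3px;
          }
          .Heading:first-letter {
            color: #fff;
            background: #1e5bb8;      /* softer than pure Blue */
            padding: 2px 6px;
            margin-right: 4px;
          }
        </style>
        <div class="Heading">Executive Summary</div>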

    Read the article

  • Facebook shared links not opening (not redirecting correctly) [on hold]

    - by Hammad
    I have a problem, and after hours of searching I have found nothing useful to counter it. I have a website, and that website's RSS feed is attached to a Facebook page. As I post content to the website, it also appears on the Facebook page. But people on my page complain that my posts don't open when they click the link to read the details. Since I am a Chrome user, I didn't notice this happening for months. When I checked in Firefox and Internet Explorer, I found that the shared links actually don't open in those browsers: they only work in Chrome, meaning they are properly redirected in Chrome and not in Firefox and IE. Whenever I click on links of posts on my page in IE or Firefox, the URL simply isn't redirected to my website and I see nothing, as if I weren't connected to the Internet. Examining the URL, I see this: http://www.facebook.com/l.php?u=http%3A%2F%2Fbit.ly%2F112NVO9&h=EAQHDgTUv&s=1 which suggests Facebook is not redirecting the links properly. I also use the link-shortening service bit.ly for my shared links, but I have checked that the problem exists even if I don't shorten them. And I am not alone: even links from the tech giant mashable.com don't open in IE and FF from Facebook; they only open in Chrome. From a mobile phone the links aren't redirected properly even in Chrome. Can anyone tell me what the issue is? Not much is written about it on the Internet, as if no one else has faced this problem. P.S.: I have checked from different systems; the problem persists.

    Read the article

  • What framework & technology would you use to make a site like fancy.com? [on hold]

    - by adriancdperu
    I'm about to start a 3-month process to build something similar to fancy.com. What are your opinions on the best language + framework (server-side)? I was thinking about LAMP and CakePHP. What are the important technological issues to consider when developing? MAU assumption: 1 million. Main features:
    - Registration with Facebook
    - Normal registration
    - Search articles
    - Like articles
    - Post articles
    - Comment on articles
    - Suggest articles to friends via mail, Facebook, Twitter and about 3 or 4 more APIs
    - Ranking of articles as a cron job run every minute, with many criteria and many rankings
    - Follow users
    - Data-mining users to mail them every day with articles they have a high probability of liking
    - Operational tools for admins to add articles and close user accounts
    - Mobile focus

    Read the article

  • Several subdomains pointing to one folder (Fasthosts problem)

    - by David
    I have an ASP.NET website (e.g. www.website.com). The idea is that you go to the URL 'yourname.website.com' and my site reads the name from the subdomain, processes it and changes the content accordingly. So, I purchased my Fasthosts hosting package and created a subdomain which points to a folder within my site, e.g. 'www.website.com/folder'. However, when I now go to the URL 'yourname.website.com' it is immediately redirected to 'website.com/folder'. This means I cannot read the subdomain from the URL because it has been lost. I have tried contacting these guys, but they don't understand and keep going on about some sort of redirect script (although I don't see how any further redirects can solve my problem). Is there something I am missing here? Any suggestions or help would be much appreciated!

    Read the article

  • How to parse JSON data from the web faster [closed]

    - by Kaidul Islam Sazal
    I have a JSON inventory file, inventory.json, on the server, like this:

        [
          {
            "body" : "SUV",
            "color" : { "ext" : "White diamond pearl", "int" : "Taupe" },
            "id" : "276181", "make" : "Acura", "miles" : 35949, "model" : "RDX",
            "pic" : [ { "full" : "http://images1.dealercp.com/90961/000JNBD/001_0292.jpg" } ],
            "power" : { "drive" : "Front wheel drive", "eng" : "2.3L DOHC PGM-FI 16-VALVE", "trans" : "Automatic" },
            "price" : { "net" : 29488 },
            "stock" : "6942", "trim" : "AWD 4dr Tech Pkg SUV", "vin" : "5J8TB2H53BA000334", "year" : 2011
          },
          {
            "body" : "Sedan",
            "color" : { "ext" : "Premium white pearl", "int" : "Taupe" },
            "id" : "275622", "make" : "Acura", "miles" : 40923, "model" : "TSX",
            "pic" : [ { "full" : "http://images1.dealercp.com/90961/000JMC6/001_1765.jpg" } ],
            "power" : { "drive" : "Front wheel drive", "eng" : "2.4L L4 MPI DOHC 16V", "trans" : "Automatic" },
            "price" : { "net" : 22288 },
            "stock" : "6945", "trim" : "4dr Sdn I4 Auto Sedan", "vin" : "JH4CU2F66AC011933", "year" : 2010
          }
        ]

    These are two records; there are almost 5000 like this. I parse the JSON like this:

        var url = "inventory/inventory.json";
        $.getJSON(url, function (data) {
          $.each(data, function (index, item) {   // straightforward loop
            if (item.year == 2012) {
              $('#desc').append(item.make + ' ' + item.model + '<br/>' +
                                item.price.net + '<br/>' + item.pic[0].full);
            }
          });
        });

    This works fine. The problem is that searching and fetching is a bit slow, as there are already 5000 records and the count increases day by day. This is a straightforward brute-force loop over the data. Is there a more time-efficient way to search it than a straight loop?
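
    One sketch of a faster lookup (not from the original question; it assumes the same jQuery setup and data shape): group the records by year once after download, so each subsequent search is a hash lookup instead of a scan of all 5000 records.

        // Hypothetical sketch: build a one-time index keyed by year
        var byYear = {};
        $.getJSON("inventory/inventory.json", function (data) {
          $.each(data, function (i, item) {
            (byYear[item.year] = byYear[item.year] || []).push(item);
          });
          // Later lookups touch only the matching group, not the whole array:
          $.each(byYear[2012] || [], function (i, item) {
            $('#desc').append(item.make + ' ' + item.model + '<br/>' +
                              item.price.net + '<br/>' + item.pic[0].full);
          });
        });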

    Read the article

  • How can I tell GoogleBot that a subdirectory is now a subdomain?

    - by cwd
    I had about a million pages of a catalog indexed under a subdirectory, and that content has now moved to a subdomain. GoogleBot is crawling each of those pages and getting a 301 redirect to the new location. Even though I have set up the redirect rule in the Apache sites-enabled configuration file (i.e. Apache does the redirect early on; PHP is not even loaded), the server isn't handling the load well. GoogleBot is making around 5 requests per second, and on top of my normal traffic that spikes the CPU for a few hours at a time. I checked Webmaster Tools and the corresponding documentation for a way to let Google know that the content had moved from a subdirectory to a subdomain, with little luck. Basically, the most helpful thing I saw said to just send 301 headers for the new location. How can I tell GoogleBot that a subdirectory is now a subdomain? If that is not an option, how can I send 301 redirects for a particular subdomain more efficiently? I was thinking perhaps Nginx, but I'm not sure I can run both Apache and Nginx side by side on port 80 for different subdomains.
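
    For reference, a minimal sketch of the kind of Apache rule the poster describes (the hostnames and path are placeholders, not from the original): a single mod_alias directive that issues the 301 before any application code runs.

        # Hypothetical example: map the old subdirectory onto the new subdomain
        <VirtualHost *:80>
            ServerName www.example.com
            Redirect permanent /catalog/ http://catalog.example.com/
        </VirtualHost>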

    Read the article

  • Loading main JavaScript on every page? Or breaking it up into relevant pages?

    - by Kyle
    I have a 700kb decompressed JS file which is loaded on every page. Previously I had 12 JavaScript files on each page, but to reduce HTTP requests I combined them all into one file. This file is ~130kb gzipped and is served with gzip; however, on the client it is still unpacked and loaded on every page. Is this a performance issue? I've profiled the JavaScript with the Firebug profiler but did not see any issues. The problem (or illusion) I am facing is that jQuery libraries bundled in that file are sometimes not used on the current page. For example, jQuery DataTables is 200kb compressed and is only loaded on 2 of my website's pages. Another is jqPlot, and that is another 200kb. I now have 400kb of excess code that isn't executed on 80% of the pages. Should I leave everything in one file? Should I take out the jQuery libraries and load only the relevant JS on the current page?
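
    A sketch of the middle ground (not from the original post; the file paths and the data-needs attribute are invented for illustration): keep one small core bundle everywhere and let each page declare which heavy libraries it actually needs.

        <!-- e.g. a page that uses DataTables marks itself: <body data-needs="datatables"> -->
        <script src="/js/core.min.js"></script>
        <script>
          var needs = (document.body.getAttribute('data-needs') || '').split(' ');
          if (needs.indexOf('datatables') !== -1) {
            var s = document.createElement('script');
            s.src = '/js/jquery.dataTables.min.js';   // loaded only where used
            document.getElementsByTagName('head')[0].appendChild(s);
          }
        </script>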

    Read the article

  • Impact on SEO of adding categories/tags in front of the HTML title [closed]

    - by Mad Scientist
    Possible Duplicate: Does the order of keywords matter in a page title? All StackExchange sites add the most-used tag of a question in front of the HTML title for SEO purposes. On Stack Overflow, for example, this is usually the programming language, so you end up with a title like "python - How do I do X?" This obviously has an enormous benefit for SEO, as the programming language is an extremely important keyword that is very often omitted from the title. Now, my question is about the cases where the tag isn't an important keyword missing from the title, but just a category. On Biology.SE, for example, one would have questions like "biochemistry - How does protein X interact with Y?", or on Skeptics, "medical science - Do vaccines cause autism?" Those tags are usually not part of the search terms; they serve to categorize the content, but users don't include them in their searches. How harmful, in terms of SEO, is adding tags that are not used in searches? Is there any hard data on the impact this practice might have? The negative aspects I can imagine, but have no data to show are actually a problem:
    - I have heard that search engines dislike keyword stuffing, and this might trigger some defence mechanisms against it.
    - It's a practice associated with less reputable sites; a leading keyword that doesn't fit the actual title well might look suspicious to some users.
    - It wastes precious space in the title shown in search results.

    Read the article

  • Packaging a web application for deploying at customer site

    - by chitti
    I want to develop a Django webapp that would be deployed at the customer's site. The web app would run in the customer's private cloud environment (an ESX server here) and would use a MySQL database. The problem is that I would not have direct access to, or control of, the webapp. My question is: how do I package such a web application, with its database and other entities, so that it's easy to upgrade/update the app and its database in the future? Right now my idea is to provide a VM with the Django app and database set up; the customer can just start the VM and have the webapp running. What other options should I consider?

    Read the article

  • Looking for a modern payment processor which accepts adult sites

    - by JakeRow123
    I love how easy authorize.net is to use, but they don't accept adult sites. I am launching an adult site soon, but I can't find any good credit card processors I can use. Most sites use CCBill or Epoch, but both are terrible, since the customer is redirected to an external site which also has an ancient 1990s look. It's not like authorize.net's API, which you can query and get a result back saying whether the payment went through or not; that makes authorize.net blend seamlessly with the product. But since adult sites are against their TOS, and PayPal's too, what is a good alternative? I am looking for one that won't redirect the user away from my site, is big enough to be reliable and trustworthy, and has fairly low rates. Any help is appreciated!

    Read the article

  • Can I test my affiliate ID on a dummy webpage without it being suspended?

    - by user359650
    I've recently applied for an Amazon affiliate program (which was accepted), as I'm planning to advertise books I read on my website. Before going live with the website, I would like to:
    1. Test the whole affiliate program to make sure it's working properly.
    2. Buy the books I will review and promote through my own affiliate link, in order to get some cash back and therefore save money.
    To do so, I thought about setting up a simple HTML page (on the actual domain I applied with) which just lists the products I will buy before going live. That way I test, get some cash back, and don't expose my website (brand, content...) before going live. Can I do this without having my account suspended by Amazon (i.e. won't Amazon think I only applied to the program to get cash back, and will Amazon be happy receiving affiliate traffic from an almost empty website)?

    Read the article

  • Are bad backlinks causing thousands of 404 and 410 errors in webmaster tools?

    - by Natália
    Our Webmaster Tools account is showing 250,000 errors related to weird links from other sites. These URLs mostly come from non-existent sites or are being generated directly by our website. Here are some examples:

        oursite.com/&q=videos+caseros+sexo+pornos+gratis&sa=X&ei=R638T8eTO8WphAfF2vG8Bg&ved=0CCAQFjAC%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F5%2Fpage%2F4/page/3

    Our site is a popular Spanish adult site, yet we don't use the keywords mentioned in this URL, and apparently the link comes from our site. Some more examples:

        oursite.com/&q=losmejoresvideosporno&sa=X&ei=U__8T-BnqK7RBdjmhYsH&ved=0CBUQFjAA%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3/page/4

    Once again: not our queries, not our URLs.

        oursite/tag/tetonas

    We think it might be another site with a policy of extremely bad SEO based on other sites' branding and keyword usage:

        thirdsite/buscador/tetonas-oursite

    The question is: if other sites are generating these URLs, how can we prevent it? Why is the tag page being generated if no link was added to the other site? What should we do with these errors: 301? 410 Gone? I have read all the similar Q&As here, but none of them seems to solve our problem. It is not likely to be a bad ad (we inspected them all). Maybe some old content which Google suddenly decided to recrawl? Maybe a third party's bad SEO policy? Maybe all of the above?

    Read the article

  • Google page events monitoring and analysis

    - by Homunculus Reticulli
    I have read the Google page event documentation, but I am not sure I understand it correctly. I am new to Google Analytics, and I have two questions. First, once I have Google Analytics enabled for my site (i.e. I have inserted the tracking code in my pages, etc.), do I need to set anything else up at the Google end, i.e. in my Google Analytics account? Second, it is not clear to me how the event data works, particularly how the data can be aggregated and analyzed. For instance, if I want to track an event under category "category" for click action "action", I would use the following code snippet:

        <a href="some-uri.htm"
           onclick="_gaq.push(['_trackEvent', 'category', 'action', 'label']);">Do Something</a>

    For the sake of simplicity, let's say I am interested in monitoring click events in my header and footer, and I want to find out on which pages the header and/or footer is clicked most often. How would I set things up so that I can analyze header/footer clicks aggregated at the page level?
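
    One sketch of a possible setup (not from the original question; the category name and helper function are invented): put the page path into the event label, so the Events reports can be broken down per page. This uses the same classic _gaq syntax as the snippet above.

        <script>
          // Hypothetical helper: label each click event with the current page
          function trackClick(region) {
            _gaq.push(['_trackEvent', 'navigation', 'click',
                       region + ':' + window.location.pathname]);
          }
        </script>
        <a href="/" onclick="trackClick('header')">Home</a>
        <a href="/contact/" onclick="trackClick('footer')">Contact</a>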

    Read the article

  • What should filenames and URLs of images contain for SEO benefit?

    - by Baumr
    We know that good site architecture usually looks like this:

        example-company.com/
        example-company.com/about/
        example-company.com/contact/
        example-company.com/products/
        example-company.com/products/category/
        example-company.com/products/category/productname/

    Now, when it comes to Google Image search, it is clear that the img alt attribute, the filename/URL, and the surrounding text (captions, headings, paragraphs) have an effect on ranking. I want to ask about the filenames we should use for images (e.g. product-photo.jpg), but first about the URL. Often web developers stick all images in a single folder in the root, example-company.com/img/, and I have stopped doing that. (I don't want to get into it, but basically it seems more semantic for images to live with the content of each sub-directory.) However, when all images sit in one folder, I feel their filenames need to reflect what they are a bit more than usual, for example:

        example-company.com/img/example-company-productname-category.jpg

    It's a longer filename than just product.png, but as long as it's relevant, I see no problem with regards to SEO (unless you're keyword stuffing), and it could even help rank for the keywords "example company", "productname" and "category". So no questions there. But what about when we have placed images within the site architecture outlined at the beginning? In other words, what if image URL paths look like this:

        example-company.com/products/category/productname/productname.jpg

    My question is: should the URL be kept short like the above, with only the "productname" (and some descriptive keywords) in the filename? Or should it also include the "example-company" and "category", like so:

        example-company.com/products/category/productname/example-company-category-productname.jpg

    That seems much longer, and redundant when we look at the URL, but here are a few considerations. Images are often downloaded onto computers, and to the average user they lose their original URL, so it isn't clear where they came from. Also, some social networks, forums and other platforms leave the filename intact when images are uploaded (many others rewrite it, for example Pinterest and Facebook). Another consideration: will this really help, even ever so slightly, to rank in Google Image Search, or at least inform Google that the product is something specific to the "example-company"? For example, what if this product can only be bought at this store and is the flagship product? In addition to an abundance of internal links to this product page, would having the "example company" name and "category" in the filename help it appear in "example company" searches? In other words, is less more?

    Read the article

  • SVN Checkout folder as local webroot

    - by Shredder
    I have XAMPP installed and running, and an SVN working directory (WD) on my local machine that checks out from the repository. I set up a virtual host in XAMPP to point to my WD, but my browser (Firefox) gives me an HTTP 500 error: "Either the server is overloaded or there was an error in a CGI script." When I place a regular folder in the same location as the WD and switch the names, it works fine. Can I not use an SVN working directory as a web root folder?

    Read the article

  • Is AdWords ad blocked from top spots of SERPs until it is reviewed?

    - by Omeoe
    I have an AdWords ad and a keyword with a Quality Score of 10. Inferring from the CPC of actual clicks, my max CPC is set way beyond that of the third advertiser in the SERP for this keyword (there are three ads at the top). Still, the ad is shown in the 4th spot, which is located either on the right or at the bottom of the SERP. The only catch is that the ad's status is "under review". Is that the reason it's blocked from the top spots?

    Read the article

  • robots.txt not updated

    - by Haridharan
    I have updated my robots.txt file to block some URLs and files from Google search results, but the files are still displayed in the results. Following a suggestion from a site, I tried to refresh the robots.txt Google holds with these steps: in Google Webmaster Tools, go to Health - Fetch as Google, type the URL and click the Fetch button. The files are still displayed in search results. Note: in Google Webmaster Tools, under Health - Blocked URLs - robots.txt file, the downloaded date is a couple of days old.

    Read the article

  • Why Is Another Domain Resolving To My IP Address?

    - by Andrew
    I'm not really sure if this is something I should worry about... I'm currently renting a dedicated server which hosts a website I've created. The domain of the website was registered with GoDaddy. After submitting a sitemap to Google several months ago, I noticed that another domain name resolves to my IP address. This means that every page on my website is actually accessible from another domain. As far as I can tell, the other domain name is meaningless to me, so I'm not sure whether this is something to worry about. Is this a residual DNS record from another site that is probably no longer in use? Is it important from the standpoint of either security or SEO? My website is a .com which will later serve e-commerce purposes. The other domain has the top-level domain .st; it's the first one of those I've encountered. Many thanks in advance!

    Read the article

  • Tracking AdWords ads with different text in Google Analytics

    - by at01
    I'm trying to see how the text in my Google AdWords ads affects my metrics in Analytics. I have auto-tagging enabled, so I figured I would be able to see this automatically in Analytics. Unfortunately, if I add a second dimension of Traffic Sources - Ad Content, the metrics are only split by the ad's headline, and most of my tests change only the ads' descriptions... So I guess I need to add a tracking parameter like ?campaign=special_text to my URLs? Or is there a way to see the ads split by description? Should I add the full suite of utm_campaign/utm_medium/etc. parameters? What's the proper way to track ads that are mostly identical except for their descriptions?
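
    For reference, a sketch of what manual tagging might look like (the domain and all parameter values are placeholders; utm_content is the parameter Analytics uses to differentiate versions of ad content):

        http://www.example.com/landing-page?utm_source=google&utm_medium=cpc&utm_campaign=spring_sale&utm_content=description_variant_b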

    Read the article

  • Wordpress Queue like Tumblr?

    - by Michael Hopkins
    Hi. Is there a way to give WordPress the queue functionality that Tumblr has? Tumblr's queue, for those who don't know, is a way to space posts out without assigning specific post dates. For example, a Tumblr queue might be set to post every four hours between 9am and 5pm, so Tumblr would publish the front post in the queue at 9am, 1pm and 5pm every day. Posts are added to the queue by clicking "add to queue" instead of "publish". It's quite simple. How can this feature be added to WordPress?
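
    One rough sketch of how this could be emulated (not an existing plugin; the time slots and cron trigger are assumptions): take the oldest drafts and schedule them into fixed daily slots using WordPress's built-in "future" post status.

        <?php
        // Hypothetical sketch, run inside WordPress (e.g. from a daily cron hook):
        // schedule up to three of the oldest drafts into tomorrow's posting slots.
        $slots  = array('09:00:00', '13:00:00', '17:00:00');
        $drafts = get_posts(array(
            'post_status' => 'draft',
            'numberposts' => count($slots),
            'orderby'     => 'date',
            'order'       => 'ASC',
        ));
        foreach ($drafts as $i => $draft) {
            wp_update_post(array(
                'ID'          => $draft->ID,
                'post_status' => 'future',   // WordPress publishes it at post_date
                'post_date'   => date('Y-m-d', strtotime('+1 day')) . ' ' . $slots[$i],
            ));
        }
        ?>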

    Read the article

  • What to do with random pages after a 301 redirect?

    - by Alex
    Hello, I did a standard 301 redirect for a domain, but the original domain has about 300 pages that have some strength. It doesn't make sense to point them all at the new home page, because the individual pages cover specific topics. Also, there are no equivalent pages on the new domain, so where should these pages redirect to? I would like them to keep ranking for the same topics they used to, but without the original domain giving them strength, they will just stop ranking and die off. What should I do? Thanks, Alex

    Read the article

  • 301 Redirects for regional variants of a homepage

    - by Adam Jenkin
    I am planning to implement a website which has regional homepage variants, for example:

        mycompany.com/europe
        mycompany.com/us

    The rest of the site is region-agnostic and content will continue as:

        mycompany.com/news
        mycompany.com/about-us

    For homepage (.com) requests, I plan to redirect users to the correct homepage variant (via 301). If I cannot determine the correct one, I will fall back to redirecting them to the US homepage (/us). From an SEO point of view, firstly, is this OK? Or should I be doing anything additional to make search engines aware of the regional differences? As crawlers are region-agnostic, I plan to direct them to the US page with a 301; or should I have something on the .com page that they use? Since the regional homepages will likely be the most visited pages, they should show up as sitelinks when searching for mycompany (which I think is a good thing). Apologies for the slightly open question; I know anything SEO-related is more opinion/best practice than fact, but I am purely looking for advice.
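
    As a reference point, one sketch of the kind of additional markup sometimes used for this (the hreflang values are assumptions about the intended audiences, since "europe" by itself is not a valid hreflang region): rel="alternate" hreflang annotations that tie the regional variants together.

        <link rel="alternate" hreflang="en-us" href="http://mycompany.com/us" />
        <link rel="alternate" hreflang="en-gb" href="http://mycompany.com/europe" />
        <link rel="alternate" hreflang="x-default" href="http://mycompany.com/us" />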

    Read the article

  • CSS alignment differs per page, can't find reason [migrated]

    - by Floran
    I list products on my homepage and on a company details page. I use the exact same HTML, but for some reason the product appears different. The product name is "Artikel 1". Here the product is displayed correctly: http://www.zorgbeurs.nl/ Notice how the green price area is right below the product. But here: http://www.zorgbeurs.nl/bedrijven/76/mymedical the green price area is all the way at the bottom of the page. Why?

    Read the article
