Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

Page 219/389 | < Previous Page | 215 216 217 218 219 220 221 222 223 224 225 226  | Next Page >

  • How to enable the user to add background images to anchor links through the WordPress admin panel? [closed]

    - by janoChen
    I have CSS selectors like this in my style.css:

        .jimgMenu ul li.landscapes a { background: url(../images/landscapes.jpg) repeat scroll 0%; }

    What's the easiest way to let the user set the background image of anchor links like the ones below through the WordPress admin panel? front-page.php:

        <div class="jimgMenu">
          <ul>
            <li class="landscapes"><a href="#nogo">Landscapes</a></li>
            <li class="people"><a href="#nogo">People</a></li>
            <li class="nature"><a href="#nogo">Nature</a></li>
            <li class="abstract"><a href="#nogo">Abstract</a></li>
            <li class="urban"><a href="#nogo">Urban</a></li>
            <li class="people2"><a href="#nogo">People</a></li>
          </ul>
        </div>

    To illustrate:

        .jimgMenu ul li.landscapes a { background: url(<add background image>) repeat scroll 0%; }

    What would that code look like?
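
    One approach, sketched below under assumptions: register one Customizer image setting per menu item and print the resulting rules into the head. The .jimgMenu selectors and item names come from the question; the setting names and use of the core background_image section are illustrative, not the only way to wire this up.

        <?php
        // In the theme's functions.php -- a minimal sketch using the
        // WordPress Customizer API; setting names are illustrative.
        add_action('customize_register', function ($wp_customize) {
            $items = array('landscapes', 'people', 'nature', 'abstract', 'urban', 'people2');
            foreach ($items as $item) {
                $wp_customize->add_setting("jimg_bg_{$item}");
                $wp_customize->add_control(new WP_Customize_Image_Control(
                    $wp_customize,
                    "jimg_bg_{$item}",
                    array(
                        'label'   => ucfirst($item) . ' background',
                        'section' => 'background_image',
                    )
                ));
            }
        });

        // Print one CSS rule per item; items without a chosen image keep
        // the default rule from style.css.
        add_action('wp_head', function () {
            $items = array('landscapes', 'people', 'nature', 'abstract', 'urban', 'people2');
            echo "<style>\n";
            foreach ($items as $item) {
                $url = get_theme_mod("jimg_bg_{$item}");
                if ($url) {
                    printf(".jimgMenu ul li.%s a { background: url(%s) repeat scroll 0%%; }\n",
                           $item, esc_url($url));
                }
            }
            echo "</style>\n";
        });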

    Read the article

  • My Heroku app is inaccessible from its custom domain name

    - by picardo
    I have a Heroku app at myapp.herokuapp.com. I mapped a custom domain name to this app using A records, followed the instructions on Heroku's website to the letter, and it worked for a few days. Now when I try to access the site from the custom domain, it times out. On Chrome I get the "Oops! Google Chrome could not find that page!" message. I tried pinging the name as well, but got this error:

        ping: cannot resolve yourhostname.org: Unknown host

    The app itself is working, and I don't see any error messages from Heroku or from New Relic. What's going on here? I also tried running host, and this was the error message:

        ;; connection timed out; no servers could be reached
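
    Since both ping and host fail to resolve the name, the zone's DNS is the first thing to verify, before anything Heroku-side. A quick check is possible from PHP too -- a sketch; yourhostname.org is the placeholder from the question, not a real domain:

        <?php
        $domain = 'yourhostname.org';

        $ns = dns_get_record($domain, DNS_NS);   // which name servers answer for the zone?
        $a  = dns_get_record($domain, DNS_A);    // does the apex resolve to an address?

        var_dump($ns, $a);
        // If both come back empty, the problem is upstream of Heroku:
        // the registrar's name servers are no longer serving the zone
        // (e.g. an expired domain or changed NS delegation).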

    Read the article

  • How to make a search engine show my location map? [duplicate]

    - by Lena Queen
    This question already has an answer here: What are the most important things I need to do to encourage Google Sitelinks? (5 answers) I have a webpage that is already listed on Google Maps. If I search for the term "my web", Google only shows my web page. How do I make Google show my web page together with my location map on the right, as in this scenario? For the contact and portfolio pages, how do I make it so the user can see some of my pages beside my home page? Also, how do I make Google show the other links of my site?

    Read the article

  • Canonical tags for separate mobile URLs

    - by DnBase
    I have a Drupal website serving mobile pages from different URLs (starting with /mobile). According to Google's recommendations I should use the canonical tag to map desktop and mobile pages to each other. Right now I do this whenever I serve the same node (e.g. node/123 and mobile/node/123), but should I also do it for pages that are equivalent yet serve different content? For example, do I need to map the desktop and mobile homepages to each other even if they don't share the same content at all?
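
    For reference, the annotation pattern in Google's documentation of the time is per equivalent pair: the desktop URL declares its mobile alternate, and the mobile URL declares the desktop page canonical. A sketch, assuming the /mobile prefix from the question and an illustrative domain:

        <?php
        // Emit the desktop/mobile pairing annotations -- a sketch.
        $path     = $_SERVER['REQUEST_URI'];          // e.g. /node/123 or /mobile/node/123
        $isMobile = (strpos($path, '/mobile/') === 0);

        if ($isMobile) {
            // Mobile page: canonical points at the desktop equivalent.
            $desktop = substr($path, strlen('/mobile'));
            printf('<link rel="canonical" href="http://example.com%s">' . "\n",
                   htmlspecialchars($desktop, ENT_QUOTES));
        } else {
            // Desktop page: alternate points at the mobile equivalent
            // (the media query value is Google's documented example).
            printf('<link rel="alternate" media="only screen and (max-width: 640px)" href="http://example.com/mobile%s">' . "\n",
                   htmlspecialchars($path, ENT_QUOTES));
        }

    Pages with no true equivalent on the other side fall outside the pairing, which is why cross-annotating homepages with entirely different content is questionable.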

    Read the article

  • Best Text-to-Speech Solution for my Website [on hold]

    - by Tim Marshall
    I'm working on the 'Ease of Access' section of my website, with options to increase the font size displayed on pages, invert colours, and so on. I want to implement a plugin which, if enabled by the user, reads the content of my website aloud. Presumably my best option is a website plugin, but there might be some approach I've not come across that allows the likes of PHP to read content. I'm not entirely sure how this all works.

    Read the article

  • Strange 404 errors

    - by user1400532
    I have the website thinkmovie.in. Recently I enabled CloudFlare along with MaxCDN. When I look at my server logs, I see strange 404 errors for many of the files. For example:

        http://thinkmovie.in/img/content/15062012faith/thumbs/model_fai12e8th_latest_photoshoot_10.jpg

    but the actual URL is:

        http://thinkmovie.in/img/content/15062012faith/thumbs/model_faith_latest_photoshoot_10.jpg
        referring URL: http://www.thinkmovie.in/gallery/

    That is, the term "model_faith" is replaced by "model_fai12e8th". And one more:

        http://thinkmovie.in/image.php/?offset=1&height=120&width=144&cropratio=1.2:1%E2%84%91=/img/content/07052012pranitha/pranitha_hot_in_saguni_movie_press_meet_0.jpg?offset=1&height=120&width=144&cropratio=1.2:1%E2%84%91=/img/content/07052012pranitha/pranitha_hot_in_saguni_movie_press_meet_0.jpg

    while the actual URL is:

        http://thinkmovie.finalytics.in/image.php/?offset=1&height=120&width=144&cropratio=1.2:1%E2%84%91=/img/content/07052012pranitha/pranitha_hot_in_saguni_movie_press_meet_0.jpg?offset=1&height=120&width=144&cropratio=1.2:1&image=/img/content/07052012pranitha/pranitha_hot_in_saguni_movie_press_meet_0.jpg
        referring URL: http://www.thinkmovie.in/gallery/hotactress/album/pranitha_hot_stills_19012012pranitha/

    Here "&image" has been replaced by "%E2%84%91". I'm not able to understand how this is happening. I checked my code several times, and I cannot replicate the problem from my browser. Please help me.
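
    One plausible explanation for the second case (an assumption, not confirmed): &image; happens to be a named HTML5 character reference for ℑ (U+2111), so any layer that HTML-entity-decodes markup in front of the site -- a minifier or rewriting proxy, for instance -- can corrupt &image= in a query string; lenient decoders may accept the reference even without the trailing semicolon. A minimal demonstration:

        <?php
        // "&image;" is a named HTML5 entity for U+2111.
        $decoded = html_entity_decode('&image;', ENT_QUOTES | ENT_HTML5, 'UTF-8');
        echo $decoded, "\n";            // ℑ
        echo urlencode($decoded), "\n"; // %E2%84%91 -- the exact bytes in the broken URLs

    If that matches, toggling CloudFlare's performance features (e.g. Auto Minify) off one at a time would be a reasonable next test.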

    Read the article

  • Google Analytics recording event based on <a> title attribute

    - by rlsaj
    I am declaring:

        var title = (typeof(el.attr('title')) != 'undefined') ? el.attr('title') : "";

    and then have the following:

        else if (title.match(/^"Matching Content"\:/i)) {
          elEv.category = "Matching Content Click";
          elEv.action = "click-Matching-Content";
          elEv.label = href.replace(/^https?\:\/\//i, '');
          elEv.non_i = true;
          elEv.loc = href;
        }

    However, using the Google Analytics debugger, I can see this event is not being recorded. Any suggestions? The complete function is:

        if (typeof jQuery != 'undefined') {
          jQuery(document).ready(function gLinkTracking($) {
            var filetypes = /\.(avi|csv|dat|dmg|doc.*|exe|flv|gif|jpg|mov|mp3|mp4|msi|pdf|png|ppt.*|rar|swf|txt|wav|wma|wmv|xls.*|zip)$/i;
            var baseHref = '';
            if (jQuery('base').attr('href') != undefined) baseHref = jQuery('base').attr('href');
            jQuery('a').on('click', function (event) {
              var el = jQuery(this);
              var track = true;
              var href = (typeof(el.attr('href')) != 'undefined') ? el.attr('href') : "";
              var title = (typeof(el.attr('title')) != 'undefined') ? el.attr('title') : "";
              var isThisDomain = href.match(document.domain.split('.').reverse()[1] + '.' + document.domain.split('.').reverse()[0]);
              if (!href.match(/^javascript:/i)) {
                var elEv = [];
                elEv.value = 0, elEv.non_i = false;
                if (href.match(/^mailto\:/i)) {
                  elEv.category = "Email link";
                  elEv.action = "click-email";
                  elEv.label = href.replace(/^mailto\:/i, '');
                  elEv.loc = href;
                }
                // Note: this pattern only matches titles that literally begin
                // with "Matching Content": -- double quotes included.
                else if (title.match(/^"Matching Content"\:/i)) {
                  elEv.category = "Matching Content Click";
                  elEv.action = "click-Matching-Content";
                  elEv.label = href.replace(/^https?\:\/\//i, '');
                  elEv.non_i = true;
                  elEv.loc = href;
                }
                else if (href.match(filetypes)) {
                  var extension = (/[.]/.exec(href)) ? /[^.]+$/.exec(href) : undefined;
                  elEv.category = "File Downloaded";
                  elEv.action = "click-" + extension[0];
                  elEv.label = href.replace(/ /g, "-");
                  elEv.loc = baseHref + href;
                }
                else if (href.match(/^https?\:/i) && !isThisDomain) {
                  elEv.category = "External link";
                  elEv.action = "click-external";
                  elEv.label = href.replace(/^https?\:\/\//i, '');
                  elEv.non_i = true;
                  elEv.loc = href;
                }
                else if (href.match(/^tel\:/i)) {
                  elEv.category = "Telephone link";
                  elEv.action = "click-telephone";
                  elEv.label = href.replace(/^tel\:/i, '');
                  elEv.loc = href;
                }
                else track = false;
                if (track) {
                  _gaq.push(['_trackEvent', elEv.category.toLowerCase(), elEv.action.toLowerCase(), elEv.label.toLowerCase(), elEv.value, elEv.non_i]);
                  if (el.attr('target') == undefined || el.attr('target').toLowerCase() != '_blank') {
                    setTimeout(function () { location.href = elEv.loc; }, 400);
                    return false;
                  }
                }
              }
            });
          });
        }

    Read the article

  • Bayesian content filter for vBulletin [on hold]

    - by mc0e
    I've been tasked with coming up with a tool to automatically flag some posts for moderator attention on a large vBulletin forum. It's not spam per se, but the task has a lot in common with the sort of handling a spam-protection plugin (a "mod" in vBulletin speak) might do. There's only so much I can say, but the task does not involve bad users so much as particular kinds of posts which the moderators need to be aware of. Filtering on user registrations and links is therefore not useful; we are talking about posts by real human users. What I'm looking for is an existing Bayesian classification plugin, or something I can study to understand how to build the vBulletin side of the interface for such a thing, i.e. I'd need ways for moderators to list flagged posts and to correct the classification of posts which have been mis-classified. Ideally I want a three-way split with an "unsure" category, in order to reduce what has to be reviewed to find any mis-classifications. Any pointers? I've searched around a bit, and so far what I've found is more or less entirely targeted at intervening in sign-ups (mostly using StopForumSpam), CAPTCHAs, and external services like Akismet, which are spam-specific. I'm also considering an external solution which might be able to be interfaced with the forum.
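
    The classifier core, as opposed to the vBulletin integration (the moderation queue and reclassification UI, which are the bulk of the work), is small enough to sketch. Below is a deliberately simplified naive Bayes scorer with the three-way verdict -- unnormalized likelihood ratios with add-one smoothing; all names and thresholds are hypothetical:

        <?php
        // Minimal naive Bayes scorer with a flag/unsure/ok verdict -- a sketch.
        class FlagClassifier
        {
            private $flagged = array();  // token => count in flagged posts
            private $normal  = array();  // token => count in normal posts
            private $nFlagged = 0, $nNormal = 0;

            private function tokens($text) {
                preg_match_all("/[a-z0-9']{3,}/", strtolower($text), $m);
                return $m[0];
            }

            public function train($text, $isFlagged) {
                foreach ($this->tokens($text) as $t) {
                    if ($isFlagged) $this->flagged[$t] = 1 + (isset($this->flagged[$t]) ? $this->flagged[$t] : 0);
                    else            $this->normal[$t]  = 1 + (isset($this->normal[$t])  ? $this->normal[$t]  : 0);
                }
                $isFlagged ? $this->nFlagged++ : $this->nNormal++;
            }

            // The "unsure" band is what keeps the review queue small.
            public function classify($text, $lo = 0.2, $hi = 0.8) {
                $logOdds = log(($this->nFlagged + 1) / ($this->nNormal + 1)); // class prior
                foreach ($this->tokens($text) as $t) {
                    $f = 1 + (isset($this->flagged[$t]) ? $this->flagged[$t] : 0); // add-one smoothing
                    $n = 1 + (isset($this->normal[$t])  ? $this->normal[$t]  : 0);
                    $logOdds += log($f / $n);
                }
                $p = 1 / (1 + exp(-$logOdds)); // log-odds -> probability
                if ($p >= $hi) return 'flag';
                if ($p <= $lo) return 'ok';
                return 'unsure';
            }
        }

    Correcting a mis-classification is then just re-training with the moderator's verdict, which is the feedback loop the plugin UI would need to expose.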

    Read the article

  • How do I prevent ISPs from killing downloads of files in mid-transfer?

    - by Gorchestopher H
    I run a small website with a few users and low traffic, mostly to share personal mp3 files with a small community. Depending on their ISP, my users can't always download or stream larger files; by larger I mean larger than 1MB. Essentially the host either stops sending or the client stops receiving: one of the links along the connection chain simply ends its connection before the transfer completes. Traceroute shows no connection issues, and there are no problems with short transfers that don't take more than a few seconds. It's the transfers of ten seconds or so that end prematurely. Just doing a straight download from a direct link can trigger this error if you have the wrong ISP. Strangely enough, this is most common with users whose ISPs are essentially independent providers that buy service via a fiber link. Unfortunately these providers aren't very knowledgeable, are unable to do any testing, and insist it's a problem with the host. I have gotten my host to move my site to different servers of theirs, to the same effect. Nearly identical sites (affiliate sites, actually) experience no such issue. What can I do to troubleshoot this further? How can I prove that someone is dropping the ball, and identify who that party is? Can I do a 5MB traceroute? EDIT: Maybe I can clear up some misconceptions in my question: The files are not very large, simply over 2MB. The users do not have "slow" connections; they are at least 5Mbps. This "time out" happens very quickly, in the realm of 5 seconds, so I don't know if it's a timeout or not; the user often gets 1 or 2MB in that chunk of time. I have tried streaming with a Flash player, saving the target, forcing the download, and allowing the browser to stream the file. I have tried different browsers (Firefox, IE, Chrome). Users are able to download identical files when they are on different hosts.

    Read the article

  • How to include tags in WordPress permalinks

    - by babua
    While using a custom permalink structure for my WordPress URLs, adding a category to the structure is reflected in the URL, but when I try to include tags it shows errors. I want the tag to be included in the custom URL structure automatically; how can this be done in WordPress? Please help. When I add /%tag%/ to the custom structure field in the WordPress admin, the URL shows a "not found" message.
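
    For what it's worth, %tag% is not one of WordPress's built-in structure tags (%category% is), which is consistent with the 404s. A hedged sketch of registering one -- the hook wiring and fallback slug are illustrative, and the rewrite rules must be flushed once afterwards:

        <?php
        // functions.php -- register a custom %tag% rewrite tag.
        add_action('init', function () {
            add_rewrite_tag('%tag%', '([^/]+)');
        });

        // Substitute the post's first tag slug when permalinks are built.
        add_filter('post_link', function ($permalink, $post) {
            if (strpos($permalink, '%tag%') === false) return $permalink;
            $tags = get_the_tags($post->ID);
            $slug = (is_array($tags) && $tags) ? $tags[0]->slug : 'untagged'; // fallback is illustrative
            return str_replace('%tag%', $slug, $permalink);
        }, 10, 2);

        // Flush rewrite rules once after adding this
        // (visit Settings > Permalinks and re-save).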

    Read the article

  • How are certain analytics metrics (time on site, etc.) usually distributed?

    - by a barking spider
    I'm not sure if I've come to the right place to ask this question, but I'm gathering some information for a research project. We're trying to design an experiment that will heavily involve web analytics, and I'm trying to figure out some sensible values of mean +/- standard deviation for the following visitor-level metrics (i.e., visitor 1 spends 2 minutes on site, visitor 2 spends 1 minute -- mean 1.5 +/- 0.71...): time spent on site, and page views. If time allowed, we would put up the sites and gather the information ourselves, but we have a grant deadline coming up. I realize that the distributions of these quantities are probably heavily skewed towards zero, but we still need reasonable figures or estimates of them in order to do sample-size calculations, etc. Anyway, I'm not sure where else to turn, and I certainly have had a difficult time finding these values in the prior literature. If someone could direct me to a paper with the right information, or if you have these figures on hand (perhaps taken directly from your logs!) -- that would be amazing, and I'd love to hear from you. Thanks in advance, and even though I'm not allowed to reveal too much, rest assured that this info will be applied towards a good cause :)

    Read the article

  • Which tags to use for good SEO on the page

    - by Aaditi Sharma
    I have an event page with the following items: the event name; one or more venue names (some cases go up to 5 or more venues); event info (genre(s), language, type(s)); the date(s) on which the event runs; and the event description. Since the event name is unique and present in the title, I am assigning <h1> to it. However, there are multiple venue names, and the same venue may be repeated across the page, along with the dates. Each piece of event info is used a single time on the page. The dates are presented in a styled manner using multiple spans, to which I am going to add a title attribute. The event description is in a <p> tag. So my question is: which heading tags should I use for a good semantic description and SEO? Also, for the title attribute on the dates, which format should I keep the date in -- (dd/mm/yyyy)?
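
    For what it's worth, a hedged sketch of one sensible layout (all values illustrative): h1 for the unique event name, h2 for the repeatable sections, and the HTML5 <time> element -- whose datetime attribute takes an ISO 8601 date (yyyy-mm-dd) regardless of the displayed format -- instead of a title attribute on the spans. The schema.org Event properties are an optional extra layer:

        <?php /* A template sketch -- names and dates are illustrative. */ ?>
        <article itemscope itemtype="http://schema.org/Event">
          <h1 itemprop="name">Event Name</h1>           <!-- unique, matches the page title -->
          <h2>Venues</h2>
          <ul>
            <li itemprop="location" itemscope itemtype="http://schema.org/Place">
              <span itemprop="name">Venue Name</span>   <!-- repeatable: one li per venue -->
            </li>
          </ul>
          <!-- machine-readable date in datetime, any human format in the text -->
          <time itemprop="startDate" datetime="2013-07-01">1 July 2013</time>
          <div itemprop="description"><p>Event description...</p></div>
        </article>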

    Read the article

  • Is the use of hashbang really a good idea? [on hold]

    - by user32642
    I've been working on a WordPress site lately that was designed with hashbangs (#!) in the dynamically generated URLs. After doing some research, I noticed that Google expresses some preferences about their use and about how it crawls such sites. However, after I ran several sitemap generators and the Screaming Frog SEO Spider, I realized that the only page being crawled was the index page. So now I am questioning the use of hashbangs. What do you think? Should I attempt to remove them? Or does it even matter? And does anyone know of an easy way to remove them? The site is www.modernvintage1005.com
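
    Part of why crawlers stop at the index page: everything after # never reaches the server. Google's AJAX crawling scheme for hashbang URLs (since deprecated) required the server to answer equivalent ?_escaped_fragment_= requests with HTML snapshots -- without that, a #! site looks like a single page. A sketch of the server side, with the snapshot renderer left hypothetical:

        <?php
        // Google rewrites http://example.com/#!about to
        // http://example.com/?_escaped_fragment_=about and expects
        // a static HTML snapshot of that state in return.
        if (isset($_GET['_escaped_fragment_'])) {
            $route = $_GET['_escaped_fragment_'];  // e.g. "about"
            render_static_snapshot($route);        // hypothetical snapshot renderer
            exit;
        }

    Given that the scheme is deprecated, removing the hashbangs in favour of real paths is the more durable fix.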

    Read the article

  • Multiple domains for different products?

    - by alexandertr
    I have a website with software applications. Is it good for SEO to choose one keyword-rich domain name for each of our software products, or should we stick to a single domain? From a user's perspective I think a keyword-rich domain would be easier to remember, as the user instantly knows what the product is for. But I have read articles saying the latest trend in SEO is to stick to one domain for all of your products and invest in this single-domain website. Is that true? What do you advise? Should I register a separate domain for each of our products, or use only one single domain? Should I do a 301 redirect with .htaccess to a single domain? And what about the sitemaps? Should I register all sites in Google Webmaster Tools and post a separate sitemap for each one of them? Should my main site's sitemap include all pages, or should separate domains have their own sitemaps?
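
    If the consolidation route is chosen, the 301 itself is small. A sketch at the PHP level (an equivalent mod_rewrite rule at the web-server level is the more common deployment; the domain is illustrative):

        <?php
        // Funnel any product domain to the canonical host with a 301,
        // preserving the requested path and query string.
        $canonicalHost = 'www.main-domain.example';
        if (strcasecmp($_SERVER['HTTP_HOST'], $canonicalHost) !== 0) {
            header('Location: http://' . $canonicalHost . $_SERVER['REQUEST_URI'], true, 301);
            exit;
        }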

    Read the article

  • Looking for advice on B2B promotion [closed]

    - by IconicDigital
    Can anyone recommend affiliate networks that focus on B2B development? We are about to launch a UK job search engine that allows job boards to list their jobs on the engine. We have decided to keep the advertising in-house, with the goal of keeping costs down. I was wondering if anyone could offer advice on potential advertising routes we could take -- for example B2B affiliate networks, AdWords, etc. We are launching an empty site, and ideally we would like to attract recruitment agencies or businesses to sign up for either a free or paid account; they can then begin to populate the engine with job listings. An obvious choice so far would be to promote on networks like LinkedIn. Any ideas? Thanks

    Read the article

  • Help with a PHP cURL script [closed]

    - by Sumeet Jain
    This script uses a cookie.txt in the same folder, chmoded to 777. The problem I am facing is that I have many accounts to log in to. Say I have 5 accounts: when I created cookie1.txt, cookie2.txt and so on, the script worked with the POST data. But I want it to stay logged in and post data for every account. Can anyone tell me how to do this? The code which works for login and posting data is http://pastebin.com/zn3gfdF2 and the code I need should be something like this (I tried using the same cookie.txt, but I guess it expires): http://pastebin.com/45bRENLN Please help me with dealing with the cookies, or suggest how to modify the code without using cookie files.
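
    A hedged sketch of the per-account cookie-jar pattern (the URL and form fields are illustrative): CURLOPT_COOKIEFILE is read on each request and CURLOPT_COOKIEJAR is written when the handle closes, so pointing both at one file per account keeps each session separate and persistent across runs.

        <?php
        // One cookie jar per account -- a sketch.
        function make_handle($account) {
            $jar = __DIR__ . "/cookie_{$account}.txt";
            $ch  = curl_init();
            curl_setopt_array($ch, array(
                CURLOPT_RETURNTRANSFER => true,
                CURLOPT_FOLLOWLOCATION => true,
                CURLOPT_COOKIEFILE     => $jar,  // send stored cookies
                CURLOPT_COOKIEJAR      => $jar,  // persist new cookies on curl_close()
            ));
            return $ch;
        }

        $ch = make_handle('account1');
        curl_setopt($ch, CURLOPT_URL, 'http://example.com/login.php'); // illustrative
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
            'user' => 'account1', 'pass' => 'secret',               // illustrative fields
        )));
        $response = curl_exec($ch);
        curl_close($ch); // cookies written to cookie_account1.txt here

    When a stored session has expired, the site's login page comes back instead of the logged-in page; detecting that in $response and re-posting the login is the usual refresh strategy.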

    Read the article

  • Simple SQL GROUP BY custom groups question [migrated]

    - by alex
    Imagine a MySQL table that has only 2 columns, an id and the name of a color. With this query I know how many ids I have for each color:

        SELECT color_name, COUNT(id) FROM color_table GROUP BY color_name;

        red: 10
        blue: 5
        yellow: 3
        green: 1

    My question is: is there a way I can specify custom groups to the GROUP BY? I mean, is there a query that results in this?

        red: 10
        colors different than red: 9
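
    One common way is to fold the custom buckets into the query itself with a CASE expression; MySQL allows grouping by the select alias. A sketch (the query carried in a PHP string, matching the table from the question):

        <?php
        // Bucket every color that is not 'red' into one group;
        // add more WHEN arms for more custom buckets.
        $sql = "
            SELECT CASE WHEN color_name = 'red'
                        THEN 'red'
                        ELSE 'colors different than red'
                   END AS color_group,
                   COUNT(id)
            FROM color_table
            GROUP BY color_group
        ";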

    Read the article

  • Deleted files still accessible without www in URL

    - by phlegma
    I have deleted all files and all hidden files from my server; there is nothing left but log files, which cannot be deleted. Ironically, the files are still accessible even though nothing is there. The cache is cleared, and I have checked with multiple browsers and computers/devices. The files show when I exclude "www" from the URL:

        http://sarastringfellow.com/assets/photo/c.jpg
        http://www.sarastringfellow.com/assets/photo/c.jpg

    What does this mean?

    Read the article

  • Why is Cloudflare waiting for name servers for over 4 days?

    - by user29175
    I registered for CloudFlare's free plan and completed the process of redirecting the DNS as instructed, including changing the name servers. That was 4 days ago. The problem is that CloudFlare is showing me, under "Websites": "Finishing up. Waiting for your name servers to change to * Please allow up to 24 hours to complete this process (info)", and under "Dashboards": "Analytics data could not be loaded. You do not have any initialized zones". I can see via traceroute that CloudFlare is the DNS for my site. Also, this has somehow messed up my Google Analytics account, so I have no idea whether I get visitors to my site or not. What should be done to fix this?

    Read the article

  • Find last match of string automatically

    - by jowan
    I want to make the id for entries 7 digits long. When the first entry is created, it will get id 0000001. My problem is that I want to take the id and add 1 to it every time a new entry is created. I have a bunch of code and am still confused about how to implement it:

        $str_rep  = "0000123";
        $str_rep2 = "0005123"; // my character string can be like this
        $str_rep3 = "0009123"; // or like this

        $match_number = array(1,2,3,4,5,6,7,8,9); // I created an array to do it automatically, but it did not work.

        // So I do it manually
        $get_str  = strstr($str_rep, "1");
        $get_str2 = strstr($str_rep2, "5");
        $get_str3 = strstr($str_rep3, "9");

        // Result
        echo $get_str . "<br>";
        echo $get_str2 . "<br>";
        echo $get_str3 . "<br>";

    Thanks in advance
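
    If the goal is just "the next 7-digit id", PHP's own parsing and padding primitives avoid the string matching entirely -- a minimal sketch:

        <?php
        // intval() drops the leading zeros; sprintf('%07d') restores
        // them after the increment.
        function next_id($last) {                            // e.g. "0000123"
            return sprintf('%07d', intval($last, 10) + 1);   // "0000124"
        }

        echo next_id('0000123'), "\n"; // 0000124
        echo next_id('0005123'), "\n"; // 0005124
        echo next_id('0009999'), "\n"; // 0010000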

    Read the article

  • Suggest a Windows web host provider for the following requirements

    - by op_amp
    Hi, we have an ASP.NET MVC 3 web app which uses SQL Server 2008 for its database. We also have a client-side desktop application which uses SQL Server 2008. While developing the system, we have been able to sync tables using the SQL Server replication feature. Now we want to host our site on a web server, but we are clueless about it. If any of you have a similar system working, please suggest a cheap but reliable web host which supports replication. Initially there will be approximately 10 or fewer clients, who will perform replication 2 or 3 times a day. The size of the database will definitely be less than 4GB.

    Read the article

  • Domain mapping issues

    - by Nadya
    I have two domain names -- .com and .co.uk -- bought from 123-reg, and just one student Windows hosting pack, associated with the .co.uk domain. The .com domain is the main one people will be trying to access, so I mapped that domain to the hosting this morning. The problem is that I would really like it to be functional by tomorrow morning, and the usual waiting time is 24-48 hours. Is there any point in stopping the process and forwarding it with a CNAME record instead -- does that take less time? (I can go back and do proper domain mapping over the weekend.) Also, is there a way to check whether the domain mapping has been done correctly before these 24-48 hours are up? From some computers I get a 404 error on the homepage.

    Read the article

  • Should I add rel nofollow to internal links which already have meta noindex?

    - by CamSpy
    Let's say I have a products page listing products, and the page has pagination. I would like the first page to have all the search-engine ranking weight, so I decided to put meta noindex on the rest of the paginated pages (from page 2 to N). My common sense says that if I don't want these pages to get indexed, I also shouldn't pass link/PR juice to them. (Is that smart?) What happens if I set rel="nofollow" on all pagination links from page 2 to page N?
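
    A side note worth weighing (hedged, since goals differ): the usual pairing here is noindex with follow -- keep pages 2..N out of the index while still letting link equity flow through the pagination, whereas rel="nofollow" on the links would cut that flow. A minimal sketch of the meta tag emission, with $page standing in for however the listing tracks its page number:

        <?php
        // On paginated listing pages beyond the first, keep the page
        // out of the index but let its links be followed.
        if ($page >= 2) {
            echo '<meta name="robots" content="noindex, follow">' . "\n";
        }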

    Read the article

  • Webmail with option to change password for email account?

    - by arma
    I've been testing out different webmail options (so far AfterLogic and Horde), and it seems there is no option for users to change the password of their email account. It's a real problem that I have to go to the server and manually change passwords for users. Is there any webmail solution that lets the user change their password, with the change also applied on the server? Or is there a server setting I must enable first? Or is it not possible? EDIT: Note that I am on a cPanel host.

    Read the article

  • What is the SEO-recommended method for using underscores and dashes in URLs that contain geographic locations?

    - by ElHaix
    In reading through this article: "In Subfolder & File Names, Use Dashes, Not Underscores" --

        Good: http://www.domain.com/sub-folder/file-name.htm
        Bad: http://www.domain.com/sub_folder/file_name.htm

    In my URLs, I may have one or two city names, ending with the province/state: Burnaby_New_Westminister-BC/[some search term]. My URL rules are currently defined such that everything after the dash is the province/state. Some geographic locations already contain dashes: Notre-Dame-de-Grâce (in QC), which I would convert to ~/Notre_Dame_de_Grace-QC/. I thought of placing the province/state after another "/"; however, in some cases the province/state name may not exist, thus ~/Notre_Dame_de_Grace/, so the first term after the domain name contains the geo location {city, city_name-state}. I am now revisiting this and wondering if this rule set should change, and if so, what is the recommended way of implementing it? -- UPDATE -- After reviewing this video, I see that I should be using dashes rather than underscores. However, since I still want my geo locations in the first URL section, is there anything wrong with using a double-dash separator, i.e. /city-name--state/?
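
    Double dashes are legal in a URL path segment, so nothing technically blocks that scheme. A sketch of the slug builder it implies (the function name and fallback behaviour are illustrative; //TRANSLIT output can vary with the iconv implementation and locale):

        <?php
        // Transliterate accents, keep the city's own hyphens, then join
        // city and province with a double dash so the two parts stay
        // unambiguous -- a sketch.
        function geo_slug($city, $prov = '') {
            $s = iconv('UTF-8', 'ASCII//TRANSLIT', $city);          // Grâce -> Grace
            $s = strtolower(preg_replace('/[^A-Za-z0-9-]+/', '-', $s));
            $s = trim($s, '-');
            return $prov === '' ? $s : $s . '--' . strtolower($prov);
        }

        echo geo_slug('Notre-Dame-de-Grâce', 'QC'), "\n";        // notre-dame-de-grace--qc
        echo geo_slug('Burnaby New Westminster', 'BC'), "\n";    // burnaby-new-westminster--bc
        echo geo_slug('Notre-Dame-de-Grâce'), "\n";              // notre-dame-de-grace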

    Read the article

< Previous Page | 215 216 217 218 219 220 221 222 223 224 225 226  | Next Page >