Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

Page 180/389

  • preview form using JavaScript in popup

    - by user1015309
    Please, I need some help previewing a form in a popup. I have a quite big form, so I added a preview option that shows as a popup. The lightbox popup works well, but the problem I now have is that passform() isn't passing the inputs (text field, select, checkbox, radio) into the popup for preview on click. Below are my JavaScript and HTML; I left the CSS and some HTML out because I think they're not needed. I will appreciate your help. Thank you.

    The JavaScript:

        function gradient(id, level) {
            var box = document.getElementById(id);
            box.style.opacity = level;
            box.style.MozOpacity = level;
            box.style.KhtmlOpacity = level;
            box.style.filter = "alpha(opacity=" + level * 100 + ")";
            box.style.display = "block";
            return;
        }

        function fadein(id) {
            var level = 0;
            while (level <= 1) {
                setTimeout("gradient('" + id + "'," + level + ")", (level * 1000) + 10);
                level += 0.01;
            }
        }

        // Open the lightbox
        function openbox(formtitle, fadin) {
            var box = document.getElementById('box');
            document.getElementById('shadowing').style.display = 'block';
            var btitle = document.getElementById('boxtitle');
            btitle.innerHTML = formtitle;
            if (fadin) {
                gradient("box", 0);
                fadein("box");
            } else {
                box.style.display = 'block';
            }
        }

        // Close the lightbox
        function closebox() {
            document.getElementById('box').style.display = 'none';
            document.getElementById('shadowing').style.display = 'none';
        }

        // pass form fields into variables
        var divexugsotherugsexams1 = document.getElementById('divexugsotherugsexams1');
        var exugsotherugsexams1 = document.form4.exugsotherugsexams1.value;

        function passform() {
            divexugsotherugsexams1.innerHTML = document.form4.exugsotherugsexams1.value;
        }

    The HTML (with just one text field, to try):

        <p><input name="submit4" type="submit" class="button2" id="submit4"
                   value="Preview Note" onClick="openbox('Preview Note', 1)"/></p>
        <div id="shadowing"></div>
        <div id="box">
            <span id="boxtitle"></span>
            <div id="divexugsotherugsexams1"></div>
            <script>document.write('<PARAM name="SRC" VALUE="'+exugsotherugsexams1+'">')</script>
            <a href="#" onClick="closebox()">Close</a>
        </div>
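    One likely issue, judging from the code alone (this is a sketch, not from the original thread): exugsotherugsexams1 is read once while the page loads, before the user has typed anything, and document.write() only runs during the initial parse anyway. Reading the field inside passform() at click time, and calling it from the preview button, matches the behavior being asked for (names as in the code above):

        // Read the field when the user clicks, not at page load.
        function passform() {
            var value = document.form4.exugsotherugsexams1.value;
            document.getElementById('divexugsotherugsexams1').innerHTML = value;
        }

        <!-- type="button" so the click doesn't also submit the form -->
        <input name="submit4" type="button" class="button2" id="submit4"
               value="Preview Note" onClick="passform(); openbox('Preview Note', 1);"/>

    With that in place, the document.write() line inside the popup markup can simply be removed.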

    Read the article

  • What's the Quickest and Cheapest Solution to Set Up an Affiliate Program for an Online Product?

    - by szahn
    I have a simple HTML landing page set up for an online product I want to sell. This product is a hardcover book. I want to allow other people to set up their own landing pages and make a percentage of the sale from their site. What are some good payment processors or payment gateways that make setting up an affiliate system easy and fast? Clarification: when someone purchases an item, I want the payment processor (whatever it is) to automatically route a percentage of that payment to the affiliate and the rest to the original author. Are there any payment frameworks that already do this? I've found a few sites that let you do this, but they seem to restrict you to digital purchases only. My site, however, sells a shippable product, and the affiliate system needs to support this.

    Read the article

  • client website compromised, found a strange .php file. any ideas?

    - by Kevin Strong
    I do support work for a web development company, and today I found a suspicious file called "hope.php" on the website of one of our clients. It contained several eval(gzuncompress(base64_decode('....'))) commands, which on a site like this usually indicates that it has been hacked. Searching Google for the compromised site, we got a bunch of results linking to hope.php with various query strings that seem to generate different groups of SEO terms; the second result from the top is legitimate, all the rest are not. Here is the source of "hope.php": http://pastebin.com/7Ss4NjfA And here is the decoded version I got by replacing the eval()s with echo(): http://pastebin.com/m31Ys7q5 Any ideas where this came from or what it is doing? I've of course already removed the file from the server, but I've never seen code like this, so I'm rather curious about its origin. Where could I go to find more info about something like this?
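    For anyone inspecting a similar payload, the echo() trick can be run from the command line so the file never executes in a web context. A minimal sketch (the payload string is a placeholder for the base64 blob copied out of the file; such payloads are often nested, so repeat until plain PHP appears):

        <?php
        // decode.php - run with: php decode.php
        // Prints the hidden code instead of executing it.
        $payload = 'BASE64_BLOB_FROM_HOPE_PHP';  // placeholder
        echo gzuncompress(base64_decode($payload));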

    Read the article

  • In Linux, which tools are free to use to make Web site mockups?

    - by user11173
    I am using Ubuntu/Fedora. Which mockup builders can I use before making a website?

    Follow-up: I downloaded Balsamiq Mockups for Desktop (http://www.balsamiq.com/download), whose download page offers these direct links:

      - Cross-platform: MockupsForDesktop.air
      - Windows: MockupsForDesktop.exe
      - Mac OS X: MockupsForDesktop.dmg
      - Linux 32-bit: MockupsForDesktop32bit.deb
      - Linux 64-bit: MockupsForDesktop64bit.deb
      - Windows with Adobe AIR bundled: MockupsForDesktopInstallerWin.zip (for offline installations)

    However, the page also notes that Adobe AIR for Linux is no longer supported, and that older, unsupported versions are only available from the AIR archive.

    Read the article

  • How to show the right country domain in Google Places?

    - by Baumr
    Background: A site has multiple ccTLDs: example.com for the US, example.co.uk for UK users, example.de for Germans, etc. Googling for certain city keywords returns rich snippets with a list of Google Places results.

    Problem: When searching on Google Germany, the domain for US users (example.com) appears instead of the corresponding ccTLD (example.de). This is not a good user experience, as users would most likely prefer to book on a site localized for them (e.g. in their language and currency).

    Question: What solutions are there? Is it possible to return different ccTLDs in rich snippets for Google searches in Germany/UK?

    Ideas: Would implementing the hreflang annotation resolve this? What about entering multiple corresponding URLs in the structured data markup?
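    For reference, the hreflang annotation mentioned above takes the form of link elements in the head of every regional version, each version listing all alternates including itself. A sketch using the example domains from the question:

        <link rel="alternate" hreflang="en-us" href="http://example.com/" />
        <link rel="alternate" hreflang="en-gb" href="http://example.co.uk/" />
        <link rel="alternate" hreflang="de" href="http://example.de/" />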

    Read the article

  • Disallow robots.txt from being accessed in a browser but keep it accessible to spiders?

    - by Michael Irigoyen
    We make use of the robots.txt file to prevent Google (and other search spiders) from crawling certain pages/directories in our domain. Some of these directories/files are secret, meaning they aren't linked (except perhaps on other pages encompassed by the robots.txt file). Some of these directories/files aren't secret; we just don't want them indexed. If somebody browses directly to www.mydomain.com/robots.txt, they can see its contents. From a security standpoint, this is not something we want publicly available to anybody. Any directories that contain secure information are behind authentication, but we still don't want them to be discoverable unless the user specifically knows about them. Is there a way to provide a robots.txt file but have its presence masked from John Doe accessing it from his browser? Perhaps by using PHP to generate the document based on certain criteria? Perhaps something I'm not thinking of? We'd prefer a way to do this centrally (meaning a <meta> tag solution is less than ideal).
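    The PHP idea floated in the question is workable: rewrite requests for robots.txt to a script and vary the output by user agent. A rough sketch follows; note that user-agent strings are trivially spoofed, so this only hides the rules from casual browsers, not from anyone determined (the directory names are placeholders):

        # .htaccess
        RewriteEngine On
        RewriteRule ^robots\.txt$ robots.php [L]

        <?php
        // robots.php - full rules for known crawlers, bland output otherwise.
        header('Content-Type: text/plain');
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        if (preg_match('/googlebot|bingbot|slurp/i', $ua)) {
            echo "User-agent: *\nDisallow: /private-dir/\n";  // the real rules
        } else {
            echo "User-agent: *\nDisallow:\n";                // nothing to see
        }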

    Read the article

  • How should I track multi-valued page attributes (e.g. tags) using custom variables?

    - by Simon
    Our pages can each have many tags, e.g. 'football', 'sms', 'nsfw', etc., which we would like to track in Google Analytics. We're already tracking things like category using Google Analytics custom variables, and we've used three of the five available slots so far. How can we track tags the same way? If we just mush them all together, e.g. 'football, sms, nsfw', can we still track the pages that are tagged 'football'? What's the right way to track multi-valued page attributes using custom variables?
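    A common workaround with the classic ga.js API is exactly the "mush together" approach, with a delimiter, and then substring matching ("Tags contains football") in advanced segments or profile filters. A sketch, assuming slot 4 is free (note that each slot's name-value pair is length-limited, so long tag lists get truncated):

        // Page-level custom variable holding all tags for this page.
        _gaq.push(['_setCustomVar',
            4,                    // slot 1-5; 4 assumed free here
            'Tags',               // variable name
            'football|sms|nsfw',  // delimited tag list
            3                     // scope 3 = page-level
        ]);
        _gaq.push(['_trackPageview']);  // pageview must fire after _setCustomVar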

    Read the article

  • What percent of visitors should click on the next page before you enable prefetching?

    - by Kevin Burke
    Mozilla Firefox and Google Chrome support prefetching via an HTML tag:

        <!-- in chrome -->
        <link rel="prerender" href="http://example.org/index.html">

    I suppose it is always worthwhile to include this tag if 100% of users on a page click on the "Next Page" button or similar, and never worthwhile to include it if only 2% or 3% of users visit the following page. At what percent of clicks should you turn on prefetching of the next page? 65%? Also, does the calculus change if the current page is HTTP and the next page is HTTPS?
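    If the threshold is data-driven, the hint doesn't have to be static markup. A sketch of enabling prerender only when analytics say the click-through rate clears whatever cutoff you settle on (the rate and cutoff values here are placeholders):

        // Inject the prerender hint only above a click-through threshold.
        var NEXT_PAGE_CTR = 0.72;   // would come from your analytics
        if (NEXT_PAGE_CTR >= 0.65) {
            var hint = document.createElement('link');
            hint.rel = 'prerender';  // Chrome; Firefox uses rel="prefetch"
            hint.href = 'http://example.org/index.html';
            document.getElementsByTagName('head')[0].appendChild(hint);
        }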

    Read the article

  • Google Analytics/AdWords account and leaking of private data

    - by Satellite
    I am frequently asked to log into clients' Google Analytics and AdWords accounts. If I forget to log out before visiting other Google properties (Google search, YouTube, etc.), this leaves tracks of my views/searches, exposing my activities to the client. Summary:

      1. Client gives me access to their Google Analytics / AdWords account.
      2. I log into the client's Analytics account and do some stuff.
      3. In another tab, I perform some related Google searches to solve some related issues.
      4. Issues solved, I close the Analytics tab.
      5. I then visit google.com and perform some unrelated searches.
      6. I then visit YouTube and view some unrelated videos.

    All web and YouTube searches are recorded in the client's Google account, thus leaking potentially sensitive data. Even assuming that I remember to log out correctly at step 4 (as I do 95% of the time), anything I do at step 3 is exposed to the client. I would be surprised if this is not a very common issue. I'm looking for a technical solution to ensure that this can never happen. Any ideas?
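    One technical option is to isolate client sessions in a dedicated browser profile, so the client's cookies never share state with your own browsing. A sketch (the profile names are placeholders):

        # Chrome: a separate profile directory per client
        google-chrome --user-data-dir="$HOME/.chrome-clients/client-a"

        # Firefox: named profiles, run alongside your normal session
        firefox -P client-a -no-remote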

    Read the article

  • Font displays differently in Firefox vs. Chrome

    - by Goro
    It seems that my menu bar is displayed with a different font stretch in Firefox than it is in Chrome. Here is the CSS applied to the element:

        font-variant: small-caps;
        font-size: 13px;
        letter-spacing: 0px;
        font-family: Arial;
        font-stretch: normal;
        text-decoration: none;

    As far as I can tell, everything regarding the font is exactly the same, yet the two browsers still render it differently (see the screenshot). Any ideas? Thanks,

    Read the article

  • Why do different browsers return different search results at Google and how can I prevent it?

    - by Sei
    I am running some websites and am constantly checking keyword rankings on google.com. It is really important for us to see the organic search results without logging in or setting a specific location. This morning my colleague and I checked the same ranking in both IE and Firefox, and the results, surprisingly, were very different (it almost feels like IE was logged in, because the ranking is much higher, while in reality it is not). I changed computers and the same problem occurred. It did not happen before. Can anyone tell me why this is?
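    This is usually search personalization (cookies, search history, location guessing) rather than a logged-in session. One widely circulated, though unofficial, way to request depersonalized results (treat it as an assumption, since Google doesn't document it) is appending pws=0 to the query string:

        http://www.google.com/search?q=your+keyword&pws=0

    Comparing in a private/incognito window with cookies cleared is the more reliable cross-check.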

    Read the article

  • Any mobile-friendly Credit Card billing solutions for mobile sites similar to Bango?

    - by Programmer
    Are there any mobile-friendly credit card billing solutions for mobile sites, similar to Bango? Compared to regular credit card solutions, the advantages of Bango that make it considerably "mobile-friendly" are:

      1) It does not require the user to enter their full name and billing address to make a payment. The user is only required to enter their credit card number, expiration date, and CVC code (if they are in the U.S., they will also have to enter their Zip Code). That is significantly less input than is normally required for credit card payments, which is a big plus on small mobile keypads.

      2) After a user makes an initial credit card payment, their details are stored by Bango, and the next time the user needs to make a payment with the same card, they just have to click a single link and it processes the payment on the stored card. Needless to say, this is very convenient for mobile users, as it is analogous to direct carrier billing as far as the user is concerned: they won't need to input any details.

    The downsides of Bango are that its fees are higher than others', all payments must be processed via its site and branding, there is a high minimum ($1.99) and a low maximum ($30) on how much you can charge users, and you need to pay a monthly fee on top of the high transaction costs. It is due to these downsides that I am looking for an alternative solution that also offers advantages 1) and 2) above. Is there anything like that? I looked at JunglePay, and it does neither 1) nor 2).

    Read the article

  • Redirect/Rewrite Subdomain to Subfolder

    - by Laurent Ho
    I'm trying to redirect a subdomain to a subfolder, e.g. forums.domain.com to www.domain.com/forums. Note that I started the forums in the subfolder format, but I'm worried that members might mistakenly try to access the forums using the subdomain format.

        RewriteCond %{HTTP_HOST} ^(www\.)?forums\.domain\.com
        RewriteRule .* /forums [L]

    From what I've read, the code above should work through .htaccess, but do I still need to create a DNS A record to point forums.domain.com to the IP address of the server?
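    Worth noting: the rule as written is an internal rewrite, so visitors would stay on forums.domain.com. If the goal is to send them to the canonical URL, a sketch of an external 301 looks like this (and yes, forums.domain.com still needs a DNS record, A or CNAME, or the request never reaches this .htaccess at all):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?forums\.domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.domain.com/forums/$1 [R=301,L]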

    Read the article

  • Is it a good idea to add robots "noindex" meta tags to deep, low-content pages, e.g. product model data?

    - by Cognize
    I'm considering adding robots "noindex, follow" tags to the very numerous product data pages that are linked from the product style pages in our online store. For example, each product style has a page with full text content on the product:

        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE

    Then many data pages with technical data for each model code are linked from the product style page:

        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-1
        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-2
        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-3

    It is these technical data pages that I intend to add the noindex tag to, as I imagine this might stop them from cannibalizing keyword authority from the more important, content-rich pages on the site. Any advice appreciated.
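    For reference, the tag being considered would sit in the head of each model-data page:

        <meta name="robots" content="noindex, follow">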

    Read the article

  • Multiple URLs going to the same page - kosher for Google?

    - by Ashoka15
    I hear conflicting answers from people about this; I'm a developer by trade, and my SEO knowledge is not what it should be. Here's my situation: I run a website that lists hotels, restaurants, bars, shops, etc. for a small Asian beach town. Lots of establishments here are hotels with a restaurant and bar, as well as restaurants that are also bars. As an example, a Mexican restaurant that also functions as a full cocktail bar. I first set it up so each establishment has one page, but can create multiple pages based on their other areas of business. This forces people to create TWO listings under the same name, and most just add the exact same information onto each page, making things redundant. I am rearranging the database so that an establishment has only ONE listing (one unique page referenced by the unique code '12345ABCDEF') that is accessible from browsing under both "Restaurants" and "Bars", with the URL structures:

        site.com/dining/mexican/12345ABCDEF/business-name.html
        site.com/bars/cocktail_bars/12345ABCDEF/business-name.html

    I could easily simplify the URL to just the unique code and name:

        site.com/12345ABCDEF/business-name.html

    But I found that Google has parsed my URL structure and lists pages like this on their SERP:

        Home > Dining > Mexican

    with each element pointing to the default page for the homepage, restaurants, and Mexican restaurants. If I simplify the URL structure, will I lose these associations? Could Google also be picking up this structure from the breadcrumb trail at the top of the page? What is the best way to set up URLs on these pages so I am not penalized by Google for having identical information on two URLs, while still being able to have places show up as they did with the old system?
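    One option worth knowing about here (a sketch, not from the original question): rel=canonical lets both browse paths keep their URLs while telling Google which one is authoritative. It would go in the head of both the /dining/ and /bars/ versions, pointing at whichever you pick as primary:

        <link rel="canonical"
              href="http://site.com/dining/mexican/12345ABCDEF/business-name.html" />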

    Read the article

  • HTML5-based advertisement guidelines

    - by picus
    I want to experiment with the idea of an HTML-based ad that utilizes my company's search API. Is anyone here aware of any rules or documentation (general or per network) that explains the guidelines for creating such ads, i.e. markup, delivery, etc.? Note this is not a question about how to use my company's API; I already know how to do that. For example, I would like to access the API with JSONP, probably via jQuery. Can this be done? Would I host the ad and have it loaded via an iframe? I just don't know these things. It is all so new to me... I... I'm scared. Actually, I'm not. However, I would like to know. Thanks in advance.
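    On the "can this be done" part: yes, jQuery handles JSONP as long as the API supports a callback parameter. A minimal sketch (the endpoint URL, query, and response shape are placeholders, not the company's actual API):

        // "dataType: 'jsonp'" makes jQuery generate and register the
        // callback name automatically and append it to the request.
        $.ajax({
            url: 'http://api.example.com/search',  // placeholder endpoint
            data: { q: 'sneakers' },
            dataType: 'jsonp',
            success: function (results) {
                // render the ad creative from the response
                console.log(results);
            }
        });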

    Read the article

  • Is there a limit of emails/pictures per Gravatar account?

    - by Steve Taylor
    I'm building a site to connect patients to doctors. Each doctor will have a profile picture. I'm quite happy to manually maintain the profile pictures as there won't be that many doctors nor will they have a need to change their picture very often, if at all. I thought of using Gravatar to host all these profile pictures. The idea is to create a single Gravatar account then keep adding email addresses to it in the form [email protected] and associating each one with a new image. Does anyone know, however, if I will run into any per-account limit? If so, it wouldn't be feasible because I would end up with a bunch of Gravatar accounts instead of just the one.
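    For reference, Gravatar resolves an image by hashing the email address, so each doctor's picture would be fetched like this (a sketch using Gravatar's documented hashing scheme; the address is illustrative):

        <?php
        // Gravatar URL: md5 of the trimmed, lowercased email address.
        $email = 'doctorname@mydomain.com';  // illustrative address
        $url = 'http://www.gravatar.com/avatar/' . md5(strtolower(trim($email)));
        echo '<img src="' . $url . '" alt="profile picture" />';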

    Read the article

  • Is Azure Compatible with JPEG XR?

    - by Shawn Eary
    I just put an F#/MVC app into a Windows Azure solution as a Web Role. Before migration, my JPEG XR (*.WDP) files were displayed on the client in IE9 without issue, on both my local and hosted sites. Now, after migration to Windows Azure, my JPEG XR files are neither displayed in my local Windows Azure compute emulator nor when deployed to http://*.cloudapp.net. Is there some sort of conflict between Windows Azure and JPEG XR (*.wdp) files? If so, what is the accepted best practice for overcoming it?
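    The usual culprit in cases like this (an educated guess, not confirmed in the question): IIS in the Azure web role refuses to serve files whose extension has no MIME mapping, and .wdp is not in the default list. A web.config sketch adding the JPEG XR mapping:

        <configuration>
          <system.webServer>
            <staticContent>
              <!-- remove first, in case a mapping already exists -->
              <remove fileExtension=".wdp" />
              <mimeMap fileExtension=".wdp" mimeType="image/vnd.ms-photo" />
            </staticContent>
          </system.webServer>
        </configuration>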

    Read the article

  • SEO with duplicate content

    - by user16831
    I have a nature photography site with multiple types of photo galleries. Each photo and associated caption on my site appears in several galleries. For instance, a photo of a goldfinch taken on a trip to New Mexico in 2008 will appear in the "goldfinch.php" gallery, in the "finches.php" gallery, and in the "New_Mexico_2008.php" gallery. This duplication is useful for my site visitors (User A may want to see goldfinch photos, whereas User B wants to see photos from New Mexico), but I am concerned about the SEO implications. The typical suggestions for dealing with duplicate content, such as 301 redirects and canonical tags, probably won't work in this case, because the page content is substantially different (ranging from ~1% to ~90% duplication, depending on the specific example chosen). The obvious solution to me would be to edit robots.txt to only allow search engines to crawl one type of gallery; for instance, if they crawled only the galleries organized by species (e.g. goldfinch.php), every photo on my site would be found exactly once. However, the Google content guidelines recommend against blocking crawler access to duplicate information. Should I go ahead and use robots.txt anyway? Or is there a better solution?
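    If the robots.txt route wins out despite the guideline, the rule set stays small: one Disallow per non-species gallery (a sketch using the example filenames above). The alternative many sites choose is a robots noindex meta tag on the duplicate galleries, which lets crawlers follow the links but keeps those pages out of the index.

        User-agent: *
        Disallow: /finches.php
        Disallow: /New_Mexico_2008.php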

    Read the article

  • setting up freedns with an existing domain

    - by romeovs
    I've been successfully running a web server off a PC at a static IP for the past 5 months. Recently, however, I moved into another apartment, and my ISP only provides a dynamic IP (my IP changes from time to time). I'm not an internet genius, but I was thinking of fixing this by using a dynamic DNS provider, so I got on the web and found freedns. I'm a bit confused about how to set everything up, though. I've managed to successfully install the IP updater daemon on my web server. Then, in my registrar's control panel, I set the NS records to point at ns1 through ns4.afraid.org (removing the old NS records). I'm not certain what I should do with the A records, though (for now they still point to the old static IP address). I have A records for www, blog, irc, etc., but I cannot point them at my new IP address, because it isn't static. Could someone explain this in the clearest possible sense (perhaps elaborating on what happens at each step of the DNS process)? I never really knew what the A records are for anyway. (Note that I haven't really found any documentation at the freedns website, or on Google.)
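    On the last point: an A record is just the hostname-to-IP mapping itself, and with dynamic DNS the updater daemon rewrites that mapping at the provider whenever the ISP hands out a new address. Once the records are managed there, dig is a quick way to watch them change (the hostname below is a placeholder):

        # what the world currently resolves the name to
        dig +short www.yourdomain.com A

        # ask one of the afraid.org nameservers directly, skipping caches
        dig +short www.yourdomain.com A @ns1.afraid.org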

    Read the article

  • building a website

    - by Ant
    A couple of my friends run a business and asked me to build them a public website. It will only be used for information about the company, with some pictures; no transactions will be involved. Right now I work for a company where I build internal websites and do a lot of backend programming in C#. I understand HTML, CSS, jQuery, etc., so I feel like I am completely capable of building a website for them. However, I don't have all the basic knowledge needed to build one from scratch. For example: where should we host the files, what types of security issues do I need to be aware of, what's the best software to use for developing websites (I use Visual Studio at work), where can I find some design techniques, etc.? Any help is appreciated.

    Read the article

  • If a visitor's IP address contains "google" or a similar keyword, does this mean they were a crawler?

    - by Roscoe
    Hi, I have a huge list of IP addresses recorded from various visitors to a website. A huge share of the visitors, in some months over 70%, came from addresses that contained keywords such as google, yahoo, bot, crawler, etc. Does this mean that those users were in fact search engine crawlers? If so, why are there so many crawlers in my visitor records in comparison to genuine human visitors? (And if not, what's the explanation?) Thanks in advance.
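    Those keywords come from reverse-DNS hostnames that the log tool resolves, rather than from the numeric IP addresses themselves. Google's suggested way to verify a claimed Googlebot is a reverse lookup followed by a forward lookup of the result, checking that both match. A sketch with an address from Googlebot's published range (output illustrative and abbreviated):

        $ host 66.249.66.1
        1.66.249.66.in-addr.arpa domain name pointer crawl-66-249-66-1.googlebot.com.

        $ host crawl-66-249-66-1.googlebot.com
        crawl-66-249-66-1.googlebot.com has address 66.249.66.1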

    Read the article

  • Estimate of Hits / Visits / Uniques in order to fall within a given Alexa Tier?

    - by Alex C
    Hi there! I was wondering if anyone could offer rough estimates of how many hits a day move you into a given Alexa tier:

        Top 5,000
        Top 10,000
        Top 50,000
        Top 100,000
        Top 500,000
        Top 1,000,000

    I know this is incredibly subjective, thus the broad brush strokes with the number ranges... BUT I've got a site currently ranked just over 1.2M worldwide and over 500k in the USA (http://www.alexa.com/siteinfo/fstr.net). Pretty cool for something hand-built on weekends (pat self on back). I was applying to an ad platform and was told that their program doesn't accept webmasters with an Alexa rank greater than 100,000. (Time to take back that pat on the back, I guess.) I know that my hits in the last 30 days are somewhere on the order of 15,000 uniques and 20,000 pageviews. So I'm wondering: how much harder do I have to work to achieve my next "goals"? I'd like to break into the top million, then re-evaluate from there. It'd be nice to know what those targets translate into (very roughly, of course). I imagine that Alexa ranks and tiers become very much exponential as you move up the ranks, but even anecdotal evidence from other webmasters would be really useful to me (i.e.: I have a site that is ranked X and it got Y hits in the last 30 days). Thanks :) - Alex

    Read the article

  • Tips for managing internal and external links using WordPress [closed]

    - by keruilin
    So I'm looking for ways to optimize my site for users and search engines. I've read several articles and looked at several different plugins. To say the least, I'm thoroughly confused about the best practices for managing internal and external links. Here is a list of some of my questions:

      - Which internal links should be set to "nofollow"?
      - Which external links should be set to "nofollow"?
      - To what degree does actively managing links contribute to your PR?
      - Should you use "nofollow" blindly on all links in comments?
      - If a link to an external site is broken (404 or whatever), should you "nofollow" that link? What about "noindex"?

    As you can see, lots of questions. I'm hoping that you experienced webmasters can give a newb some best-practice advice.
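    For concreteness, "setting a link to nofollow" just means adding the rel attribute to the anchor, whichever plugin ends up managing it:

        <a href="http://example.com/some-page" rel="nofollow">unendorsed link</a>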

    Read the article

  • Is a subdomain per service a good idea for SEO?

    - by Kennie R.
    I am creating a site with quite a few services, such as a free account service, a subdomain for the site's blog, and then an article base and other related services. Would having them all on subdomains be a good idea? Are there any caveats you are aware of in existing search engines for this? I believe mapping foo.example.com to example.com/foo, to provide an alternative just in case, is a good idea for sitemaps; I like to keep things clean.

    Read the article
