Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

Page 184/389 | < Previous Page | 180 181 182 183 184 185 186 187 188 189 190 191  | Next Page >

  • Need a really simple client management script to deliver graphics and revisions, please help?

    - by Mark R
    I am looking for a very simple client management script. The process flow should be:

      - The client orders (via PayPal etc.) while giving specs on what they need.
      - They are given login details and thanked for their order.
      - Their backend consists of a two-way communication: they ask questions, we answer.
      - We also upload the graphics there, where they either accept them or ask for a revision.
      - Process complete.

    Now I cannot for the life of me find something as simple as this. It seems all the scripts out there are way too complicated. Does anyone know of one I can use to do this?

    Read the article

  • Configure htaccess to show index.php as the default page instead of permissions error

    - by Jan De Laet
    Having a problem with my .htaccess. I have this to secure all my documents:

        Order Deny,Allow
        Deny from all
        Allow from 127.0.0.1
        <FilesMatch "\.(htm|html|css|js|php)$">
            Order Allow,Deny
            Allow from all
            Allow from 127.0.0.1
        </FilesMatch>

    Now everything works fine, except that the index page of www.mysite.com doesn't work and gives me the notification: "You don't have permission to access / on this server." If I go to www.example.com/index.php it works, but if I surf to www.example.com I get this message. How can I fix this?
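
    One way to sidestep this is sketched below (a minimal sketch, not the poster's setup: it assumes Apache 2.2-style access directives, and the extensions shown are only examples). Deny just the file types that must stay private instead of denying everything, so the directory request for / is never blocked and mod_dir can serve index.php normally:

        DirectoryIndex index.php
        <FilesMatch "\.(ini|log|sql|inc)$">
            Order Allow,Deny
            Deny from all
        </FilesMatch>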

    Read the article

  • Issue with image lightbox and enlargement / jQuery Mobile

    - by Matt
    I'm working on a redesign of my weather website using jQuery Mobile. I have it set up so that you drill down through a series of content containers to get to the weather info (each group of info opens in a dialog display). Everything's worked well, but I've run into an issue with my images. I have them sized so that they fit a mobile device's screen nicely, but because of that, when you look at them in a desktop browser you can't really make out what the image is. I've tried several image lightbox / enlargement solutions, but for some reason none of them have worked: either nothing happens or the images open in a new window. I suspect jQuery Mobile is somehow overriding the scripts and CSS of the lightbox / enlargement plugins I've tried, but I'm not completely sure that this is the case, or, if it is, how I can get around it to enlarge the images to their original size, preferably on click. Here is a working (for the most part - still some kinks to work out) example: if you look under the "Tropical" section at the "Satellite-Derived Products", you'll see what I mean. http://www.suncoaststormwatch.com/Beta/Index.html
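
    As a starting point, a minimal sketch of a click-to-enlarge handler that avoids third-party lightbox CSS entirely (the .weather-thumb selector is hypothetical, and it assumes the thumbnails are constrained by max-width and jQuery 1.7+ for .on()):

        // toggle a thumbnail between its constrained size and its natural size
        $(document).on('click', '.weather-thumb', function () {
            var enlarged = $(this).data('enlarged');
            // clearing the inline max-width restores the stylesheet's constraint
            $(this).css('max-width', enlarged ? '' : 'none').data('enlarged', !enlarged);
        });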

    Read the article

  • What tools exist for designing layouts and pre-production templates for Rails 3 applications?

    - by rcd
    I develop Rails 3 applications, but prior to this my background was as a designer (typically making mockups in Photoshop and then breaking them down into HTML5/CSS3). Now, some great tools/templates exist for getting working layouts ready for Rails and other apps quickly, e.g., http://railsapps.github.com/rails-composer/. Many are using CSS frameworks such as Twitter Bootstrap. I'd like to know whether there is a local app (for Mac) that can design layouts, much the way Dreamweaver would, but that is geared towards being used with Twitter Bootstrap alongside Ruby (Rails) or Python apps, etc.

    Read the article

  • Monitoring GWT Website

    - by Raf
    We currently monitor our webapps using curl. More and more of our webapps use the GWT framework, which relies heavily on JavaScript, so we can't rely on our curl-based system for monitoring anymore. We are therefore looking for the right monitoring tool, but it seems difficult to find a crawler that is lightweight (no Selenium, please) yet handles JavaScript correctly. PS: we host our webapps as well as the probes; we don't want any Internet monitoring service.

    Read the article

  • Google Analytics: Why is Avg Time on Site lower than Avg Time on Page?

    - by Melanie
    I have the following Custom Report set up in Google Analytics:

      Metrics: Avg Time on Page, Avg Time on Site
      Dimensions: Page

    So a report looks like this:

        Page                 Avg Time on Page   Avg Time on Site
        /an-article          00:03:14           00:00:11
        /another-article     00:05:11           00:01:07
        /something-written   00:03:00           00:00:31

    Why is it that for each page, Avg Time on Site is significantly lower than Avg Time on Page?

    Read the article

  • Recommend hosting with fast MySQL database please [closed]

    - by Keith Groben
    Possible Duplicate: How to find web hosting that meets my requirements? I am frustrated to no end with my current hosting provider, Media Temple. Yes, they are flashy, and they have a decent degree of flexibility with their GS plan, which I have. But any time I install a site that needs a database, it is slow. Like, really slow: taking anywhere from 10-15 seconds just to load a page. I would host in-house, but there are a lot of complications that come with a LAMP server that I don't want to deal with; honestly, I'd rather spend the time developing. What can you recommend?

    Read the article

  • Edited and reversed changes in .htaccess - site starts redirecting to .comindex.php/

    - by Aurigae
    The site is a Joomla 2.5 site. I wanted to add a non-www to www redirect to the .htaccess file and did so, then the redirection went mad; I reversed the change, but the site still redirects. When I click "View site" in the admin panel, I get linked to http://domain.comindex.php/ (the website is http://www.domain.com). Visiting the website URL works without www, but once you click on Projects it acts mad too. Projects is managed with the JoomShopping extension. EDIT: the redirect also happens when URL rewriting is deactivated in the admin panel. Here is the current .htaccess:

        ##
        # @package Joomla
        # @copyright Copyright (C) 2005 - 2012 Open Source Matters. All rights reserved.
        # @license GNU General Public License version 2 or later; see LICENSE.txt
        ##

        ##
        # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
        #
        # The line just below this section: 'Options +FollowSymLinks' may cause problems
        # with some server configurations. It is required for use of mod_rewrite, but may already
        # be set by your server administrator in a way that dissallows changing it in
        # your .htaccess file. If using it causes your server to error out, comment it out (add # to
        # beginning of line), reload your site in your browser and test your sef url's. If they work,
        # it has been set by your server administrator and you do not need it set here.
        ##

        ## Can be commented out if causes errors, see notes above.
        Options +FollowSymLinks

        ## Mod_rewrite in use.
        RewriteEngine On

        ## Begin - Rewrite rules to block out some common exploits.
        # If you experience problems on your site block out the operations listed below
        # This attempts to block the most common type of exploit `attempts` to Joomla!
        #
        # Block out any script trying to base64_encode data within the URL.
        RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
        # Block out any script that includes a <script> tag in URL.
        RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
        # Block out any script trying to set a PHP GLOBALS variable via URL.
        RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
        # Block out any script trying to modify a _REQUEST variable via URL.
        RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
        # Return 403 Forbidden header and show the content of the root homepage
        RewriteRule .* index.php [F]
        #
        ## End - Rewrite rules to block out some common exploits.

        ## Begin - Custom redirects
        #
        # If you need to redirect some pages, or set a canonical non-www to
        # www redirect (or vice versa), place that code here. Ensure those
        # redirects use the correct RewriteRule syntax and the [R=301,L] flags.
        #
        ## End - Custom redirects

        ##
        # Uncomment following line if your webserver's URL
        # is not directly related to physical file paths.
        # Update Your Joomla! Directory (just / for root).
        ##
        # RewriteBase /

        ## Begin - Joomla! core SEF Section.
        #
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
        #
        # If the requested path and file is not /index.php and the request
        # has not already been internally rewritten to the index.php script
        RewriteCond %{REQUEST_URI} !^/index\.php
        # and the request is for something within the component folder,
        # or for the site root, or for an extensionless URL, or the
        # requested URL ends with one of the listed extensions
        RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
        # and the requested path and file doesn't directly match a physical file
        RewriteCond %{REQUEST_FILENAME} !-f
        # and the requested path and file doesn't directly match a physical folder
        RewriteCond %{REQUEST_FILENAME} !-d
        # internally rewrite the request to the index.php script
        RewriteRule .* index.php [L]
        #
        ## End - Joomla! core SEF Section.

        Redirect 301 /index.html /index.php
        Redirect 301 /services /project
        Redirect 301 /projects/projects.html /project
        Redirect 301 /projects/project1.html /project
        Redirect 301 /projects/project2.html /project
        Redirect 301 /projects /project
        Redirect 301 /keypersonnel.html /about-agrin/keystaff
        Redirect 301 /cooperation.htm /about-agrin/intcoop
        Redirect 301 /member.html /about-agrin/memberships
        Redirect 301 /contact.html /contacts
        Redirect 301 /hr.htm /jobs
        Redirect 301 /index.php/404 /index.php
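
    For reference, a minimal sketch of the kind of canonical non-www to www rule the post describes adding (an assumption about the intended rule, not the poster's actual code; it belongs in the "Custom redirects" section above, with mod_rewrite enabled):

        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]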

    Read the article

  • Is there a media player that works on HTTPS sites?

    - by Iain Hallam
    I'm currently using the Yahoo! Media Player for a site that needs to play MP3 files that are stored on our server. In total there's quite a bit more audio than SoundCloud's free limits allow, but each file is only a few minutes long. YMP is pretty good, but it causes security warnings on HTTPS pages because it can only be served via HTTP. Is there an equivalent free player I can embed on the HTTPS pages? EDIT: Just to clarify, I'm initially looking for something that will scan the page and make media links playable.

    Read the article

  • Question about SEO and Domains

    - by jasondavis
    This is my first post on here, as I am mainly on Stack Overflow and Server Fault. I have been programming for at least 10 years now and have made hundreds of websites, but I have only recently started getting into design and the SEO side of sites; it's sad that I have been overlooking these for so many years. I have picked up pretty good general knowledge over the years, but I have never really looked into SEO until now. My question: I would like to build a site that targets many different keywords in the search engines. For example, let's say I built a site about outdoor activities called outdoorreview.com and planned on having many sections: hunting, fishing, hiking, camping, cycling, climbing, etc. For the best search engine results, how could I get the most search engine traffic to all these areas? Also, how should I structure the URLs that lead to them: outdoorreview.com/hiking/ or hiking.outdoorreview.com?

    Read the article

  • So now Google has said no to old browsers; when can the rest of us follow suit?

    - by Richard
    Google recently announced that they will drop support for older browsers on August 1st: http://www.bbc.co.uk/news/technology-13639875 http://gmailblog.blogspot.com/2011/06/our-plans-to-support-modern-browsers.html "For this reason, soon Google Apps will only support modern browsers. Beginning August 1st, we'll support the current and prior major release of Chrome, Firefox, Internet Explorer and Safari on a rolling basis. Each time a new version is released, we'll begin supporting the update and stop supporting the third-oldest version." There is nothing worse than looking at the patching of code that takes place to support older browsers. If we could all move towards a standards-only web (I'm looking at you, IE9), then surely we could spend more time programming good web apps and less time trying to make them run equally on terrible, non-standards-compliant older browsers. So when can the rest of us expect to be able to tell our clients that we no longer support older browsers? It seems that large corporates will continue to run older browsers, and even if Google Chrome Frame can be installed without admin privileges (it's coming soon, currently in beta), we can't expect all users to be motivated to do this. I appreciate any thoughts.

    Read the article

  • HTML5-based advertisement guidelines

    - by picus
    I want to experiment with the idea of an HTML-based ad that utilizes my company's search API. Is anyone here aware of any rules or documentation (general or per network) that explains the guidelines for creating such ads - i.e. markup, delivery, etc.? Note this is not a question on how to use my company's API; I already know how to do that. For example, I would like to access the API with JSONP, probably via jQuery - can this be done? Would I host the ad and have it loaded via an iframe? I just don't know these things. It is all so new to me... I... I'm scared. Actually, I'm not. However, I would like to know. Thanks in advance.
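
    For illustration, a minimal sketch of the kind of JSONP call being described, using jQuery (the endpoint and parameters are hypothetical, not the company's actual API):

        $.ajax({
            url: 'https://api.example.com/search',
            dataType: 'jsonp',            // jQuery appends a callback=? parameter for JSONP
            data: { q: 'running shoes', limit: 5 },
            success: function (results) {
                // render the returned items into the ad creative's markup here
                console.log(results);
            }
        });

    Whether an ad network allows third-party script execution like this depends on that network's creative specs, so the markup and delivery rules still need to be checked per network.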

    Read the article

  • PageRank 0 penalty

    - by mark
    I have had a WordPress blog and a www website on the same domain for about one year; together they are about 170 pages. The PageRank is still 0. I understand that PageRank 0 is a penalty for duplicate content. The pages are indexed in Google, but there is still no PageRank. In Google Webmaster Tools there is no indication of any problem. I asked for reconsideration of both the blog and the website a month ago; Google accepted the reconsideration requests, but it did not change anything. Other sites of similar size and similar audience earn PR 4-6. Is there something I can do in order to get a fair PageRank? A coworker told me that it might be the case that a link farm is using the content and I can do nothing about it. Is there a reliable way to check for something like that? I do not like to give up so quickly; is there a chance to fix this by, for example, moving to another domain?

    Read the article

  • Clarity around Advanced Segment definition

    - by Btibert3
    I am hoping to get some clarity around an advanced segment I created. For context, our website spans multiple domains. For reasons I won't get into, I created an advanced segment that looks for pages containing my subdomain of interest (subdomain.site.com). I want to ensure that my interpretation of this advanced segment is accurate: simply put, does it flag all visits to our entire domain that viewed at least one page on my subdomain of interest? If I am off, what does this advanced segment represent? Many thanks in advance!

    Read the article

  • How to support tableless columns with a WYSIWYG editor?

    - by Andy
    On the front page of a site I'm working on there's a small slideshow. It's not for pictures in particular (any content can go in), and I'm currently setting up the editing interface for the client. I'd like to be able to have one/two/more columns in the editable area, and ideally that would be done via CSS. Does anyone know of a WYSIWYG editor that supports this? I'm using Drupal (I would prefer not to involve Panels, as it would require a bit of work to make it a streamlined workflow for content entry), in case that matters to anyone. To start the ball rolling, one way would be to use templates. I know CKEditor supports templates, and it looks like TinyMCE might have something similar. I don't know how well these work with tableless columns (the CKEditor homepage demo uses tables to achieve its two-column effect). Holding out for a cool solution!

    Read the article

  • How to do a 3-tier architecture using PHP [closed]

    - by Ric
    I have a requirement from a client that my PHP web application be 3-tier. For example, I would have a web server running Apache in the DMZ, but it should NOT contain any DB connections; it should connect to a middle server that hosts the business objects but sits behind the firewall, and those objects in turn connect to my SQL cluster on another server. I have actually done this using .NET, but I am not sure how to set up my stack using PHP. I suppose I could have my UI front tier call the middle tier using REST-based web services if I create the middle tier as a second web server, but this seems overly complex. The main reason for this is advanced security: we cannot have any passwords on the first-tier web server in the DMZ. The second reason is scalability: to have multiple servers on different tiers that can handle the requests. The last reason is deployment: it is easier if I can take one set of servers offline for testing before putting them back into production. Is there an open source project that shows how to do this? The only example I can find is the web server hosting files from a shared drive on another machine (kind of how DotNetNuke pretends to be 3-tier), but that is NOT secure.

    Read the article

  • How can I make sure my website will be available during a presentation?

    - by johnny_s
    I have an online presentation to do next week and I have it all ready to go. The website is HTML and CSS only (no DB), and currently resides on my shared hosting account. Now, although my shared hosting is (relatively) reliable, I have noticed that recently they have been making some changes and my website has been unavailable at times. I don't want this to happen to me on the morning of my presentation, so I am asking what is the best way to prepare for such a thing? My domain is www.presentation.mydomain.com and I would like to keep this if possible (even if issues arise). I have been thinking of a few alternatives:

      - Host my site on two different domains or servers (but what about the domain name?)
      - Have a portable XAMPP version on a USB stick (again, domain name?)
      - Possible failover site/location

    Update: The presentation will be carried out on their laptop, not mine. So I am unable to install any software.

    Read the article

  • Googlebot requesting an invalid URL

    - by Rob Walker
    I have a web app which emails me exceptions automatically. This morning there was an error relating to the URL /Catalog/LiveCatalog?id=ylwpfqzts. The id is invalid (it should be a GUID) and caused a parsing error. Everything was handled correctly and an error page was returned. What was odd is that the user agent reported itself as Googlebot and the IP is registered to Google. The URL would never have been generated by my web app, but it doesn't look particularly malicious. Has anyone ever seen anything like this?

    Read the article

  • Lazy-loading images and SEO

    - by surpr
    I'm lazy-loading images with a noscript fallback. Should I expect any damage in the SERPs? The site is completely thumbnail-based. Also, should I put a smaller image size in the noscript fallback to increase crawlability? We have nearly a million thumbnails, so it's a decision I'm hesitant to make. The reason I'm thinking about it in the first place is that we're upping the thumbnail size by about 50%, which will add about 10% to the page size.
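
    For reference, a minimal sketch of the markup pattern being described (class names, file names and the smaller-fallback idea are illustrative assumptions, not the site's actual code):

        <!-- lazy-loaded thumbnail: real file in data-src, tiny placeholder in src -->
        <img class="lazy" data-src="/thumbs/shirt-480.jpg" src="/img/placeholder.gif" alt="Funny shirt">
        <!-- crawlable fallback; could reference a smaller rendition to save weight -->
        <noscript><img src="/thumbs/shirt-240.jpg" alt="Funny shirt"></noscript>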

    Read the article

  • Should webmasters "index" dashboard and edit account page

    - by francoboy7
    New here; I did my research and found nothing, but sorry if this has already been asked. As webmasters, should we let Google and other search engines INDEX our members' dashboard and account-edit pages? For example, my member John has access to a page named "Edit your account" where he can fill in some fields and update his info, and other pages where he can manage his posts (edit, delete). Such pages are of no interest to other people, so should we let Google and others INDEX them, or should we NOINDEX them? Thanks for your time! Franck
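
    For illustration, the two usual mechanisms, sketched with hypothetical paths (pick one per page: a robots.txt Disallow blocks crawling of those paths, while a meta tag lets the page be crawled but keeps it out of the index):

        # robots.txt
        User-agent: *
        Disallow: /dashboard/
        Disallow: /account/edit

        <!-- or, in the <head> of each account page -->
        <meta name="robots" content="noindex, nofollow">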

    Read the article

  • Disable outbound links without letting others know

    - by tadoman
    Is there a way I can tell Google not to follow external links (pointing to other sites) without letting others know? I know you can disable outbound links by putting rel=nofollow on them or by using robots.txt, but that's something others can see as well. I'm just wondering if there's a way to tell Google not to follow those links without letting others know about it, like a setting in Webmaster Tools or something similar. (There's definitely one way: I could set an exception in my server's conf file to check whether the user agent is "googlebot" and then serve a different version of robots.txt, so that when a different user checked it, they would get a different robots.txt than the one served to Googlebot. However, I'm not too sure Google would be too happy about this.) Thank you
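
    A minimal sketch of the user-agent trick the poster describes (assumes Apache with mod_rewrite; the file name is hypothetical, and, as the poster notes, serving search engines different content than visitors can be treated as cloaking):

        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
        RewriteRule ^/?robots\.txt$ /robots-googlebot.txt [L]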

    Read the article

  • XMLHttpRequest not working, trying to test database connection [closed]

    - by Frederick Marcoux
    I'm currently creating my own CMS for personal use but I'm blocked on some code. I'm trying to make an installation script, but the AJAX request to test whether the database works doesn't work... Here's my JS code:

        function testDB() {
            "use strict";
            var host = document.getElementById('host').value;
            var username = document.getElementById('username').value;
            var password = document.getElementById('password').value;
            var db = document.getElementById('db_name').value;
            var xmlhttp = new XMLHttpRequest();
            var url = "test_db.php";
            var params = "host="+host+"&username="+username+"&password="+password+"&db="+db;
            xmlhttp.open("POST", url, true);
            xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
            xmlhttp.setRequestHeader("Content-length", params.length);
            xmlhttp.setRequestHeader("Connection", "close");
            xmlhttp.send(params);
            $('#loader').removeAttr('style');
            if (xmlhttp.responseText !== '') {
                if (xmlhttp.readyState===4 && xmlhttp.status===200) {
                    $('#next').removeAttr('disabled');
                    $('#test').attr('disabled', 'disabled');
                    $('#test').text('Connection Successful!');
                    $('#test').addClass('btn-success');
                    $('#login').addClass('success');
                    $('#login1').addClass('success');
                    $('#db').addClass('success');
                    $('#loader').attr('style', 'display: none;');
                } else {
                    $('#next').attr('disabled', 'disabled');
                    $('#test').removeClass('btn-success');
                    $('#test').removeAttr('disabled');
                    $('#test').text('Test Connection');
                    $('#login').removeClass('success');
                    $('#login1').removeClass('success');
                    $('#db').removeClass('success');
                    $('#loader').attr('style', 'display: none;');
                }
            } else {
                $('#next').attr('disabled', 'disabled');
                $('#next').attr('disabled', 'disabled');
                $('#test').removeClass('btn-success');
                $('#test').removeAttr('disabled');
                $('#test').text('Test Connection');
                $('#login').removeClass('success');
                $('#login1').removeClass('success');
                $('#db').removeClass('success');
                $('#loader').attr('style', 'display: none;');
            }
        }

    And here's my PHP code:

        <?php
        $link = mysql_connect($_POST['host'], $_POST['username'], $_POST['password']);
        if (!$link) {
            echo '';
        } else {
            if (mysql_select_db($_POST['db'])) {
                echo 'Connection Successful!';
            } else {
                echo '';
            }
        }
        mysql_close($link);
        ?>

    I don't know why it doesn't work; I tried with jQuery $.ajax, $.get, and $.post, but nothing works...

    Read the article

  • Google search question, front page not showing

    - by user5746
    I know this is probably a dumb question, but I hope someone can give me some insight. I was ranked on Google's first page of search results for "funny st patricks day shirts", but I was third from the bottom and not familiar enough with SEO, so I signed up for "Attracta" to rank higher. Big mistake. Since using Attracta, I've lost the first page and I'm now on the fourth page for that search. What I noticed is that Google is now just showing a sub-page or side page (a link from my front page to a page which has only a few designs on it); this is not where I would want customers to land first... and my front page is not showing in that search anymore. Obviously, the title of this side page is not geared toward that search result, so I know that's why I have the PR drop. Why is my front page not ranking over that page, though? Why is it apparently gone from that search, or so far back that no one will ever find it? I need to know how to fix this quickly if anyone has any advice at all for me. It's the busiest season for my website, and the people who were stealing design ideas from me are all ranked higher than my site now (I can prove this, lol), so I'm very frustrated by that. I would be very grateful for any advice at all as to what I can do to fix this. THANKS in advance for any advice you can offer. Catelyn

    Read the article

  • Web stalker has purchased a domain name that uses my personal name, web page is defamatory [closed]

    - by Deborah Morse-Kahn
    We have been unsuccessful in persuading a stalker's website host to release the domain name he purchased, which is my own personal name, e.g., PERSONALNAME.com. You will find my name below in the signature area; look for yourself. The one page that this domain name leads to contains dreadful and defamatory material. No attorney has felt it worth their time to chase this issue down, and we cannot afford to go to a national or international dispute resolution group to bring this issue to WHOIS. Worse, the stalker is amoral and a psychopath: he would just love the attention. We've even considered trying to find someone to illegally hack into the web page, to at least redirect the domain pointers to my own professional website. This issue has continued now for two years and is affecting my professional reputation, as potential clients have looked for me online. Is there any remedy? Your help and advice would be greatly welcomed.

    Read the article

  • How to proxy a site on the same domain but a different port as a subfolder with Apache?

    - by myWallJSON
    So I have a problem: I have my main site on an Apache web server on Debian on port 80, and I am developing a web server (in some C++ or C#) which currently runs on port 6666. But some people are behind firewalls and can access only port 80. I wonder if it is possible, via Apache, to map all requests to, say, mysite.com:80/6666/url as if they were to mysite.com:6666/url - not map via redirection, but really make Apache stream the content from my server to the user as if it were in a subfolder?
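
    A minimal sketch of a reverse-proxy mapping along those lines (assumes mod_proxy and mod_proxy_http are enabled, and that the second server listens on localhost; the /6666/ prefix is just the example path from the question):

        ProxyPass        /6666/ http://localhost:6666/
        ProxyPassReverse /6666/ http://localhost:6666/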

    Read the article
