Search Results

Search found 10402 results on 417 pages for 'macbook pro'.

Page 201/417

  • htaccess url rewrite

    - by user761396
    I used to have these rewrite rules:
      RewriteRule ^([^/]+)\.htm$ index.php?c=$1 [NC]
      RewriteRule ^([^/]+)\.htm/([0-9.]+)$ index.php?c=$1&amt=$2 [NC]
    Now I have to change them to:
      RewriteRule ^1/([^/]+)\.htm$ index.php?c=$1 [NC]
      RewriteRule ^1/([^/]+)\.htm/([0-9.]+)$ index.php?c=$1&amt=$2 [NC]
      RewriteRule ^2/([^/]+)\.htm$ index2.php?c=$1 [NC]
      RewriteRule ^2/([^/]+)\.htm/([0-9.]+)$ index2.php?c=$1&amt=$2 [NC]
    (the difference is that a subdirectory has been added). My question is: how do I redirect my old URLs to the 1/ subdirectory? Thank you.
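    One common approach, sketched here as an assumption rather than a tested rule set (it presumes the old .htm URLs lived at the site root and that old requests should land on the 1/ copy), is to 301-redirect anything still matching the old pattern before the new rules run:

      RewriteEngine On
      # Send old-style /page.htm and /page.htm/12.34 URLs to the new 1/ subdirectory
      RewriteRule ^([^/]+\.htm(/[0-9.]+)?)$ /1/$1 [R=301,NC,L]

      # Existing rules for the new layout follow as before
      RewriteRule ^1/([^/]+)\.htm$ index.php?c=$1 [NC]
      RewriteRule ^1/([^/]+)\.htm/([0-9.]+)$ index.php?c=$1&amt=$2 [NC]

    The 301 lets search engines and bookmarks pick up the new location; the pattern only matches URLs whose first segment is a bare .htm file, so requests already under 1/ or 2/ are left alone.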

    Read the article

  • Site overthrown by Turkish hackers...

    - by Jackson Gariety
    Go ahead, laugh. I forgot to remove the default admin/admin account on my blog. Somebody got in and has replaced my homepage with some internet graffiti. I've used .htaccess to replace the page with a 403 error, but no matter what I do, my WordPress homepage is still the hacker's page. How can I set up my server via .htaccess so that only I can view it while I'm fixing this? What steps should I take to eradicate them from my server? If I delete the ENTIRE website and change all the passwords, is he completely gone? Thanks.
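    As a sketch of the lock-down step only (Apache 2.2-style directives, matching the Order/Allow syntax used elsewhere on this page; the IP address below is a placeholder for your own), the document root can be closed to everyone but one visitor while the cleanup happens:

      # Temporarily deny everyone except the administrator's own IP address
      Order Deny,Allow
      Deny from all
      Allow from 203.0.113.45

    This belongs in the top-level .htaccess; once the compromised files are replaced and every password (admin, FTP, database) has been changed, the block can be removed.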

    Read the article

  • Problem with missing JSON functions on PHP 5.2.6 / Plesk 8.4

    - by Drachenviech
    I have a vserver running openSuse 10.3, Apache 2 and Plesk 8.4. I can update/upgrade neither, as it is apparently not recommended to upgrade openSuse 10.3 (and an update to the EOL 10.4 does not seem to make much sense) and Plesk fails to update no matter what version I try (it even fails to upgrade to 8.4.1). Still, I can live with that somehow, primarily because I don't have the time to do a fresh remote install on the vserver. What really is a problem is that, although the installed PHP is 5.2.6, it has no zip library and no JSON functions. The first is probably because PHP was not compiled with --enable-zip. The second is a big mystery, though. As I understand it, JSON always comes with PHP unless it's compiled with the --disable-json configure option. This is, however, not the case, and the json extension module is just not there. I even tried to enable it with extension=json.so, with no luck. The configure options of my PHP (as shipped with Plesk 8.4) are:

      '../configure' '--prefix=/usr' '--datadir=/usr/share/php5' '--mandir=/usr/share/man' '--bindir=/usr/bin'
      '--with-libdir=lib' '--includedir=/usr/include' '--sysconfdir=/etc/php5/apache2' '--with-config-file-path=/etc/php5/apache2'
      '--with-config-file-scan-dir=/etc/php5/conf.d' '--enable-libxml' '--enable-session' '--with-mm' '--with-pcre-regex=/usr'
      '--enable-xml' '--enable-simplexml' '--enable-spl' '--enable-filter' '--disable-debug' '--enable-inline-optimization'
      '--disable-rpath' '--disable-static' '--enable-shared' '--program-suffix=5' '--with-pic' '--with-gnu-ld'
      '--with-system-tzdata=/usr/share/zoneinfo' '--with-apxs2=/usr/sbin/apxs2' '--disable-all' '--disable-cli'

    As I understand it, PECL is not an option with 5.2.6. Or am I mistaken? Even if I am not, the openSuse repository only goes as far as PHP 5.2.4. The openSuse install even came without zypper, which I had to install manually. So is there a way to get the zip library and JSON running in PHP 5.2.6 without having to recompile the binary?

    Read the article

  • Github Feed affecting my WordPress installation? [on hold]

    - by saul
    Any idea how this fork is affecting my site? I went to verify my website log stats and realized this may be the cause of a strange redirect constantly happening on my WordPress installation. Here's a line I found in my log:

      54.81.91.95 - - [07/May/2014:22:52:08 -0400] "GET /category/selfie/feed/ HTTP/1.1" 200 1826 "-" "feedzirra http://github.com/pauldix/feedzirra/tree/master"

    And this is the GitHub fork (or however these are called): https://github.com/feedjira/feedjira/tree/master Basically, I think every time I update my categories (selfie in this case), I get redirected to install.php, probably by triggering some GET function on that feed. To the best of my knowledge, this feed parses all URLs with this structure, blocking them, kind of like a DDoS attack? Any ideas how to go about it?
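    If the immediate goal is just to stop that parser from hammering the category feeds (a sketch only; it does not explain the redirect to install.php, which is worth investigating separately), Apache can refuse requests whose User-Agent matches the string seen in the log:

      RewriteEngine On
      # Return 403 Forbidden to the feedzirra/feedjira feed parser
      RewriteCond %{HTTP_USER_AGENT} feedzirra [NC]
      RewriteRule .* - [F,L]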

    Read the article

  • Do extra words in a URL affect SEO?

    - by smp7d
    Often, for technical reasons, we end up with some extra words in a URL that we would not want to optimize for, as they have no bearing on the content. Examples would be:
      sportssite.com/content/sports-article
      movieportal.com/node/movie-review
      electronicsforum.com/blog/top-10-cameras
      webmasters.stackexchange.com/questions/34046/do-extra-words-in-url-affect-seo
    Do these have any effect on ranking in any of the major search engines? Would it behoove us to strip the extra words?

    Read the article

  • Meta Refresh for change of page name and content

    - by user3507399
    Hopefully just a quick one. I've got a client that is changing the name of a workshop that they run. This means a change of URL and page title for keywords on which they currently have first-page rankings. The keywords are still relevant, so what I want to avoid is a 301 redirect to a page that has different keywords from the previous page. Is the best option to keep the old page live, with its URL and title, and use a meta refresh to redirect after a period of time (not instantly)? That way the SEO ranking is retained for the previous workshop name while they work on the ranking for the new name. Or would a 301 redirect have an adverse effect? Thanks!

    Read the article

  • What percent of visitors should click on the next page before you enable prefetching?

    - by Kevin Burke
    Mozilla Firefox and Google Chrome support prefetching via an HTML tag: <!-- in chrome --> <link rel="prerender" href="http://example.org/index.html"> I suppose it is always worthwhile to include this tag if 100% of users on a page click on the "Next Page" button or similar, and never worthwhile to include it if only 2% or 3% of users visit the following page. At what percent of clicks should you turn on prefetching of the next page? 65%? Also, does the calculus change if the current page is HTTP and the next page is HTTPS?

    Read the article

  • PHP-FPM stops responding and dies [migrated]

    - by user12361
    I'm running Drupal 6 with Nginx 1.5.1 and PHP-FPM (PHP 5.3.26) on a 1GB single core VPS with 3GB of swap space on SSD storage. I just switched from shared hosting to this unmanaged VPS because my site was getting too heavy, so I'm still learning the ropes. I have moderately high traffic; I don't really monitor it closely, but Google AdSense usually records close to 30K page views/day. I usually have 50 to 80 authenticated users logged in and a few hundred more anonymous users hitting the Boost static HTML cache at any given moment. The problem I'm having is that PHP-FPM frequently stops responding, resulting in Nginx 502 or 504 errors. I swear I have read every page on the internet about this issue, which seems fairly common, and I've tried endless combinations of configurations, and I can't find a good solution. After restarting Nginx and PHP-FPM, the site runs really fast for a while, and then without warning it simply stops responding. I get a white screen while the browser waits on the server, and after about 30 seconds to a minute it throws an Nginx 502 or 504 error. Sometimes it runs well for 2 minutes, sometimes 5 minutes, sometimes 5 hours, but it always ends up hanging. When I find the server in this state, there is still plenty of free memory (500MB or more) and no major CPU usage, the control and worker PHP-FPM processes are still present, and the server is still pingable and usable via SSH. A reload of PHP-FPM via the init script revives it again. The hangups don't seem to correspond to the amount of traffic, because I observed this behavior consistently when I was testing this configuration on a development VPS with no traffic at all. I've been constantly tweaking the settings, but I can't definitively eliminate the problem. I set Nginx workers to just 1. In the PHP-FPM config I have tried all three of the process managers. "Dynamic" is definitely the least reliable, consistently hanging up after only a few minutes. "Static" also has been unreliable and unpredictable. The least buggy has been "ondemand", but even that is failing me, sometimes after as much as 12 to 24 hours. But I can't leave the server unattended because PHP-FPM dies and never comes back on its own. I tried adjusting the pm.max_children value from as low as 3 to as high as 50; it doesn't make a lot of difference, but I currently have it at 10. Same thing for the spare servers values. I also have set pm.max_requests anywhere from 30 to unlimited, and it doesn't seem to make a difference. According to the logs, the PHP-FPM processes are not exiting with SIGSEGV or SIGBUS, but rather with SIGTERM. I get a lot of lines like:

      WARNING: [pool www] child 3739, script '/var/www/drupal6/index.php' (request: "GET /index.php") execution timed out (38.739494 sec), terminating

    and:

      WARNING: [pool www] child 3738 exited on signal 15 (SIGTERM) after 50.004380 seconds from start

    I actually found several articles that recommend doing a graceful reload of PHP-FPM via cron every few minutes or hours to circumvent this issue. So that's what I did: "/etc/init.d/php-fpm reload" every 5 minutes. So far, it's keeping the lights on, but it feels like a dreadful hack. Is PHP-FPM really that unreliable? Is there anything else I can do? Thanks a lot!

    Read the article

  • Configure htaccess to show index.php as the default page instead of permissions error

    - by Jan De Laet
    Having a problem with my .htaccess. I have this to secure all my documents:

      Order Deny,Allow
      Deny from all
      Allow from 127.0.0.1
      <FilesMatch "\.(htm|html|css|js|php)$">
        Order Allow,Deny
        Allow from all
        Allow from 127.0.0.1
      </FilesMatch>

    Now everything works fine, except that the index page of www.mysite.com doesn't work and gives me the notification: "You don't have permission to access / on this server." How can I fix this? www.example.com/index.php works, but if I browse to www.example.com I get this message.
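    The bare / request is a directory request, so it is caught by the outer Deny rather than by the FilesMatch block, which only matches individual files. One hedged way around this, assuming the documents being protected have known extensions (the list below is only a placeholder), is to invert the logic: allow normal requests, including the directory index, and deny just the protected file types:

      # Allow ordinary page requests, including the bare / directory request
      Order Allow,Deny
      Allow from all

      # Lock down only the document types that need protecting (placeholder list)
      <FilesMatch "\.(pdf|docx?|xlsx?|zip)$">
        Order Deny,Allow
        Deny from all
        Allow from 127.0.0.1
      </FilesMatch>

    Adding DirectoryIndex index.php alongside this makes explicit which file Apache serves for the / request.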

    Read the article

  • Protecting design ideas from being copied by other websites?

    - by mickburkejnr
    Hi everyone, I'm planning a project at the moment, while building a completely different project at the same time. Both of these projects are quite innovative in the way they either work or are presented. One of the projects hasn't been done before, and the other has competition, but I feel the competitors' websites are light years behind what I'm doing. Is there a way for me to prevent the way my sites work or are presented from being stolen? I've thought of patenting parts of them, but that requires £10,000 and I don't have that amount of money. Also, would putting a copyright notice on the site or an "All Rights Reserved" tag give me any muscle when going after websites that I feel have stolen my ideas (if they have)? Cheers!

    Read the article

  • What to use for "localhost" that includes PHP/SQL functionality?

    - by Jack
    I really do not want to install Linux at the moment; I would have to borrow a USB key, move files, format it, flash it, format it again, move files back, give it back... What would you recommend that is lightweight, easily and cleanly uninstallable afterwards (I will install Linux when I get a new DVD-ROM, which will be in ~2 weeks), and that also supports PHP and SQL? To be precise, I want to install a WordPress blog, a few plugins, etc., and develop a theme. If there is no such thing for Windows (7, x64 if that matters), let me know too; I will borrow the USB key then (even though it's a pain).

    Read the article

  • Dedicated server: managed hosting or manage it myself?

    - by ddawber
    We're currently hosting a number of sites on a self-managed dedicated server. Some companies, however, offer a managed dedicated server hosting service. They offer:
      - roughly the same server spec
      - ticketing-system support
      - managed daily backups
      - a virtual firewall (but with a limit of 10 IP addresses allowed through at any one time)
    Now, this managed hosting comes at extra expense - somewhere in the region of $500 per month - and the limit on the number of IP addresses they'll manage on the firewall is also a real pain. My thinking is it would be better and cheaper to:
      - stay with the same host, since the dedicated box is fine
      - get an Amazon AWS account and use their servers to manage backups; there are a number of good tools that can be used to automate the process
      - configure iptables so that I have complete control of the firewall
    I want to know:
      - whether a managed virtual firewall is likely to be more secure than me configuring iptables
      - whether, in your opinion, it's best to let someone else take care of backups
      - whether, from your experience, there's anything else I'm missing that warrants using managed hosting over a DIY service
    I think there is some reluctance to give up managed hosting, since a managed host in effect takes responsibility for your server, whereas any hardware or security issues with a server that we manage would mean we are forced to hold our hands up when a client site goes down. That said, I personally don't think a managed host does that much in the day-to-day running of your server (backups are automatic, OS updates are carried out with ease, etc.).

    Read the article

  • Is my project a web site or a mobile app?

    - by Evik James
    I am a web developer. I have been developing web sites professionally for 15+ years using ColdFusion, SQL Server, and jQuery. I am doing a project for a client that uses the above technologies. The site has a mobile facet too; for that, I am using jQuery Mobile. The site enables retail store personnel to gather a store customer's preferences using any smart phone or tablet. The store personnel just need to access a special link via a QR code and a login. Anyone can easily access the site from a PC's browser, too. Some sources suggest that mobile apps must be downloaded and installed using a third party, such as Google, Amazon, or Apple. Other sources suggest that any information designed for use on a mobile device is a mobile app. Regarding this site, which is specifically designed for use by mobile devices and extensively uses jQuery Mobile: is it a "web site" or a "mobile app"? What is the proper description for this type of site? My customer insists that it is one and not the other. I insist that it is the other and not the one. Can you help me clarify this?

    Read the article

  • Recommend an open source CMS for single page web site

    - by RedMan
    Hi, I want to create a single-page web site like http://kiskolabs.com/ or http://www.carat.se to display my portfolio. I want to add new products after launching the site without having to edit the entire site. I've looked at OpenCart (too much for a single-page site), Magento (more for ecommerce), and WordPress (couldn't find open source / free templates which I can start from). Can you suggest a CMS which will support the creation of a single-page site and allow insertion of new products without having to edit the entire page? I would prefer a CMS which also has open source / free templates which I can tweak for my use. I can do PHP, MySQL, and XML. If it is an easier option I can do PSD to site (but I don't know much about this at all).

    Read the article

  • RewriteRule working locally but not on remote server

    - by m0tv
    I have a .htaccess file with one simple RewriteRule:

      RewriteEngine on
      RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1

    I want to have a URL like http://www.example.com/imprint and forward it to http://www.example.com/?site=imprint. I checked this rule with a RewriteRule tester, which gave me the results I want to achieve. On my local development system it works well too. But on a remote server the URLs just give me a 404 error. Other, simpler rewrite rules are working with no problems, so everything must be set up correctly (I think..). The problem is that I don't have access to any error logs or the server configs, so the only thing I can do is guess... Can anyone tell me if there's something wrong with this rule? Or anything else I can do or test to solve this? Or does someone have an idea what could be wrong on the server?
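    Without access to the logs this is guesswork, but two things that commonly differ between a local setup and shared hosting are AllowOverride (the host has to permit mod_rewrite directives in .htaccess at all) and the need for a RewriteBase when the site is served from a subdirectory. A more defensive version of the rule, written against index.php explicitly (an assumption about the front controller) and with the usual guards so real files and directories are not rewritten, might look like:

      RewriteEngine On
      # Adjust if the site lives in a subdirectory on the remote host
      RewriteBase /

      # Leave requests for existing files and directories alone
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^([A-Za-z0-9-]+)$ index.php?site=$1 [L,QSA]

    If even this version returns 404s, the rewrite module is probably not being consulted at all, which points back at AllowOverride or mod_rewrite not being enabled on the remote server.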

    Read the article

  • Which folders need to be backed up for migration in Joomla?

    - by Devdatta Tengshe
    I'm helping someone update and migrate their old website, built on the Joomla framework. Currently it is running on Joomla 1.5.8, which is an ancient version. I've convinced them to upgrade Joomla to at least 2.5. I have already made a backup of the database. Most links I have seen talk of backing up the entire public_html folder (the website runs on a shared host), but in my fresh Joomla installation there are also several folders inside public_html. So which of the folders in public_html hold the content of the website, and which belong to the old Joomla framework? I'm afraid that I might overwrite files of the new Joomla framework with the old framework if I copy all the files and folders into the new installation.

    Read the article

  • How would I rank my keywords in the Yahoo search engine?

    - by user1430715
    I am working as a search engine optimizer team lead in a company and am facing a problem on a project, http://www.Prooftech.com.sg. Problem: the website has 10 keywords for which my client wanted top-10 rankings in the Yahoo Singapore search engine. In my 3 months of work I have got top-10 rankings for the following 7 keywords: Waterproofing, RC Roof, Wall Leakages, Ceiling Leakages, Water Leakages, Roof Tile Coating, Roof Tiles Repair. But I am still not getting listing positions for Roof, Concrete Repair, Grouting. I have done a lot of bookmarking, blog commenting, blog creation, press releases and classified ads to get these 3 keywords listed, but there is no change in the results. Can anyone help me out of this problem so I can get good rankings for Roof, Concrete Repair, Grouting?

    Read the article

  • Looking for recommendations for a server-side newsletter program

    - by Sparky672
    Hello - I'm currently using a server-side, SQL-based mailing list program called Php-List on multiple sites and it works fairly well. But installation and setup are quite cumbersome and quirky, and the interface is not well organized... neither is the code... with pieces all over the place in random fashion. Customizing the "look & feel" and full site integration are both tedious and painful. Upgrading the version is made more complex since multiple edits need to be manually transferred each time. Also, probably due to a poor English translation, descriptions and instructions within certain areas of the user interface are contradictory and unclear. You just have to play with it and remember what you did last time it worked. It's supposed to be so my customers can send out their own newsletters... after supplying a written tutorial, about half of them seem to stumble through it okay and the other half just hire me to do it for them. So not quite easy enough for most average people to use. I'm looking for something that's as easy for them as using a blog or discussion forum. It also must be easier to set up and integrate into a site than Php-List. I have no problem getting dirty and writing CSS or HTML by hand. Nor do I have any problem editing the program code. Perhaps what I'm looking for is a solution that is better organized, has a better GUI, and is template- or "skin"-based. Therefore, if I spend many hours customizing a skin, I can simply update the program and re-use my custom skin without having to reproduce the tedious setup over and over. (I currently maintain a list of about 25 things I must manually edit or add to multiple files in multiple directories each time I install or upgrade Php-List.) A great example of what I'm looking for is very much like WordPress or phpBB. They're both easy to install and customize yet powerful and packed full of features. They're also VERY well organized, making customization less painful. So enough yammering for now... does anyone know of something, besides Php-List, with many of the same features as Php-List: maintaining a mailing list with a server-side database, custom sign-up pages, automatic opt-in/opt-out, custom HTML newsletter templates, etc.? Thank you!

    Read the article

  • Will removing unused query string parameters negatively affect SEO?

    - by trm
    Will changing links to remove query string parameters that are no longer used have any negative impact on search engine rankings? Say I have a page about.php on my site, and all of my links to this page are of the form http://www.example.com/about.php?foo=bar and I've made some changes to the script such that the parameter foo is no longer used. I would like to remove the unused parameter from the links so the URL will look cleaner, but I am concerned that this could cause problems with SEO. Is it safe to remove ?foo=bar from my links?
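    Stripping the parameter from the links is generally safe on its own. If many of the old ?foo=bar URLs are already indexed or bookmarked, one optional extra step (a sketch, assuming Apache and that foo really is the only parameter those links carry) is to 301 them to the clean URL so everything consolidates on one address:

      RewriteEngine On
      # Permanently redirect /about.php?foo=anything to /about.php with no query string
      RewriteCond %{QUERY_STRING} (^|&)foo= [NC]
      RewriteRule ^about\.php$ /about.php? [R=301,L]

    The trailing ? in the substitution drops the entire query string, which is fine here because foo was the only parameter in those links.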

    Read the article

  • Google Analytics on Demo Site

    - by Josh Smith
    Will adding the UA code of the live site to a revision site affect anything adversely? They are, technically, two different sites with different metrics. I don't want to lose the old data when I initiate the new site, of course. I would also like to work on setting up the new analytics page while the revision site is in development. Does anyone have any good workflows on setting up a revision site without losing old site data?

    Read the article

  • Redirect Google crawler to different robots.txt via .htaccess

    - by user3474818
    I have googled for the answer all day and still couldn't find one. I have a virtual subdomain, www.static.example.com, which is a mirror site of www.example.com. That means I have just one root folder for the subdomain and the domain as well. I want to redirect crawlers to a different robots.txt file - robots_static.txt - when they see .static in the URL; in it I will forbid indexing via a Disallow directive. I want to do this because I have duplicated content in Google search results: the subdomain is showing the exact same content as the main domain. Does anyone know how I could make crawlers see robots_static.txt instead of robots.txt? What I have managed to find so far is this:

      RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
      RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
      RewriteRule ^robots\.txt /robots_static.txt [NC,L]

    but when I check in Webmaster Tools, it still sees robots.txt as my robots file instead of robots_static.txt, so it crawls and indexes everything twice. What did I do wrong? Thanks.

    EDIT: This is my .htaccess file:

      ##
      # @package Joomla
      # @copyright Copyright (C) 2005 - 2013 Open Source Matters. All rights reserved.
      # @license GNU General Public License version 2 or later; see LICENSE.txt
      ##
      ##
      # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
      #
      # The line just below this section: 'Options +FollowSymLinks' may cause problems
      # with some server configurations. It is required for use of mod_rewrite, but may already
      # be set by your server administrator in a way that dissallows changing it in
      # your .htaccess file. If using it causes your server to error out, comment it out (add # to
      # beginning of line), reload your site in your browser and test your sef url's. If they work,
      # it has been set by your server administrator and you do not need it set here.
      ##

      ## Can be commented out if causes errors, see notes above.
      Options +FollowSymLinks

      ## Mod_rewrite in use.
      RewriteEngine On
      RewriteEngine On
      RewriteCond %{HTTP_HOST} !^www\.
      RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

      RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
      RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
      RewriteRule ^robots\.txt /robots_static.txt [NC,L]

      ## Begin - Rewrite rules to block out some common exploits.
      # If you experience problems on your site block out the operations listed below
      # This attempts to block the most common type of exploit `attempts` to Joomla!
      #
      # Block out any script trying to base64_encode data within the URL.
      RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
      # Block out any script that includes a <script> tag in URL.
      RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
      # Block out any script trying to set a PHP GLOBALS variable via URL.
      RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
      # Block out any script trying to modify a _REQUEST variable via URL.
      RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
      # Return 403 Forbidden header and show the content of the root homepage
      RewriteRule .* index.php [F]
      #
      ## End - Rewrite rules to block out some common exploits.

      ## Begin - Custom redirects
      #
      # If you need to redirect some pages, or set a canonical non-www to
      # www redirect (or vice versa), place that code here. Ensure those
      # redirects use the correct RewriteRule syntax and the [R=301,L] flags.
      #
      ## End - Custom redirects

      ##
      # Uncomment following line if your webserver's URL
      # is not directly related to physical file paths.
      # Update Your Joomla! Directory (just / for root).
      ##
      # RewriteBase /

      RewriteCond %{THE_REQUEST} ^GET.*index\.php [NC]
      RewriteCond %{THE_REQUEST} !/system/.*
      RewriteRule (.*?)index\.php/*(.*) /$1$2 [R=301,L]
      RewriteCond %{THE_REQUEST} ^GET

      ## Begin - Joomla! core SEF Section.
      #
      RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
      #
      # If the requested path and file is not /index.php and the request
      # has not already been internally rewritten to the index.php script
      RewriteCond %{REQUEST_URI} !^/index\.php
      # and the request is for something within the component folder,
      # or for the site root, or for an extensionless URL, or the
      # requested URL ends with one of the listed extensions
      RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
      # and the requested path and file doesn't directly match a physical file
      RewriteCond %{REQUEST_FILENAME} !-f
      # and the requested path and file doesn't directly match a physical folder
      RewriteCond %{REQUEST_FILENAME} !-d
      # internally rewrite the request to the index.php script
      RewriteRule .* index.php [L]
      #
      ## End - Joomla! core SEF Section.

      <FilesMatch "\.(ico|pdf|flv|jpg|ttf|jpg|jpeg|png|gif|js|css|swf)$">
        Header set Expires "Wed, 15 Apr 2020 20:00:00 GMT"
        Header set Cache-Control "public"
      </FilesMatch>

      <ifModule mod_headers.c>
        Header set Connection keep-alive
      </ifModule>

      ########## Begin - Remove Etags
      #
      FileETag none
      #
      ########## End - Remove Etags
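    For what it's worth (a sketch, not a verified fix), the host condition can be tightened and the rule kept as simple as possible, placed near the top of the file so nothing else interferes with it:

      RewriteEngine On
      # Serve a separate robots file only to the static mirror host
      RewriteCond %{HTTP_HOST} ^www\.static\. [NC]
      RewriteRule ^robots\.txt$ /robots_static.txt [L]

    The THE_REQUEST condition isn't needed, and anchoring the rule with $ avoids accidental matches. Note also that the robots.txt tester in Webmaster Tools shows the file for the host the property was verified under, so the static subdomain would likely need to be added and checked as its own property before robots_static.txt shows up there.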

    Read the article

  • Does Fetch as Googlebot still support their ajax-crawling proposal?

    - by Gunchars
    I spent half a day implementing the server-side HTML generation for modal pages based on their proposal (link), but it seems like the Fetch as Googlebot functionality in Webmaster Tools completely ignores the URL fragment. I've verified that the _escaped_fragment_ functionality is working on my server (example), but when I submit a URL like /#!/recipes, the Googlebot just fetches /. There aren't any recent confirmations that it's working and, honestly, it wouldn't surprise me if they had just silently dropped the functionality without even editing the docs.

    Read the article

  • PPC Affiliate networks for web-applications [on hold]

    - by machete
    I want to run a browser-based social media (Twitter, Instagram, ...) account management tool (justunfollow.com) which monetizes through ads. But many affiliate networks like Google AdSense or media.net require your website to have "high quality content". AdSense explicitly states: "Google ads, search boxes or search results may not be:
      * Integrated into a software application (does not apply to AdMob) of any kind, including toolbars.
      * Placed on any non-content-based page. (Does not apply to AdSense for search, mobile AdSense for search, or AdMob.)"
    Are there serious & trustworthy affiliate networks which allow ads to be published on a web application?

    Read the article

  • Correct microdata and/or microformats for real estate listings?

    - by Ernests Karlsons
    Given that I am running a real estate rentals listing website, what would be the correct microdata or microformats for the listing pages? There is the usual data: address, photos, price, start date, possible end date, the person who is renting it out, a list of amenities, a description, etc. Are there also microformats/microdata that can be used on the listing summary page (e.g., the page that displays all listings in a particular city)?

    Read the article

  • Revived closed tab in Chrome doesn't work properly, but works correctly in IE and Firefox

    - by Kravlin
    I'm working on a website where information is loaded from a calendar. If a user clicks on a link on the calendar, it displays information about that link. If I close that page and then re-open it, it works properly in both IE and Firefox, but if I open it in Chrome and click on another item on the calendar, it errors out instead. Is there a large difference in how Chrome restores closed tabs compared to how IE or Firefox do it that would cause this?

    Read the article
