  • Is it fine to put different categories of content under a single domain name?

    - by Fahad Uddin
    I own a website about startups and finance. I am planning to move into WordPress programming and sell WordPress themes. I thought of buying a separate domain name for the WordPress site, but it takes quite a lot of time to set up a website and then do its SEO. Is it fine (in terms of SEO and professionalism) to put the WordPress category inside my old domain instead? For example: Domain: www.startupsandfinance.com; WordPress sub-domain: www.wp.startupsandfinance.com

  • Why deny access to website for msnbot/bingbot?

    - by Quandary
    I've seen quite a lot of tutorials that recommend banning user agents containing the strings libwww-perl and msnbot. I understand why one would ban libwww-perl; it's mainly, if not only, used for hacking and spamming. But why do so many sites recommend banning msnbot/bingbot? Since it's a search engine, even if only one with a marginal market share, I would expect one would want this bot to crawl one's sites. What is it that msnbot does that makes people ban it?

  • Format numbers with css

    - by Luc M
    Is it possible to format numbers using CSS? When I have 7000000.00, I would like it displayed as 7 000 000.00. I know I could write a backend (PHP, Perl...) function or a JavaScript function that could return the formatted number, but... The numbers that I want to format are in table cells. I would like to have something like <td class="myformat">7000000.00</td> or <td><span class="myformat">7000000.00</span></td>
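
    CSS alone cannot rewrite the text inside the cell, so for reference, a minimal sketch of the backend route the poster mentions, using PHP's built-in number_format() (the myformat class name is kept from the question):

        <?php
        // 7000000.00 -> "7 000 000.00": two decimals, "." as the
        // decimal point, a space as the thousands separator.
        $formatted = number_format(7000000.00, 2, '.', ' ');
        echo '<td class="myformat">' . $formatted . '</td>';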

  • How to force browsers to always reload xslt files?

    - by bitmask
    Related: Apache: How can I force the browser to reload CSS files? I'm building an XML page (on an Apache 2 server) that is supposed to be transformed to XHTML by the browser, so my server also serves a main.xslt that the XML file uses as its stylesheet, similar to the scenario with the CSS files in the linked question. However, none of the tricks provided in that answer, nor in some related questions on SO, solve the issue for Opera. While Firefox responds to F5 by fetching not only the XML file but also the XSLT file, Opera only reloads the XML file. I tried both setting the Last-Modified HTTP header via an .htaccess file and using Apache's expires module. This is what my .htaccess looks like right now:

        AddType text/xsl;charset=utf-8 .xslt
        ExpiresByType text/xsl "modification plus 1 second"
        Header set Last-Modified "Wed, 08 Jan 2000 23:11:55 GMT"
        #Header set Last-Modified "Wed, 08 Jan 2020 23:11:55 GMT"

    If I open the XSLT myself and reload it manually, the XML presentation is updated as well, but this is tedious for development. Note: there is no PHP or any kind of scripting involved; everything is static.
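
    One commonly suggested alternative to faking Last-Modified is to switch off caching for the stylesheet entirely while developing; a hedged .htaccess sketch (it relies on mod_headers, which the poster's Header set line already assumes, and whether Opera honors it for stylesheet fetches would need testing):

        # Development only: force revalidation of the XSLT on every request.
        <FilesMatch "\.xslt$">
            Header set Cache-Control "no-cache, no-store, must-revalidate"
            Header set Expires "0"
        </FilesMatch>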

  • Url rewrite subfolder to root and forbid accessing subfolder

    - by Alessandro Pezzato
    I have Drupal installed in a subfolder, drupal, but I want to access pages as if the site were in the root folder: http://www.example.com instead of http://www.example.com/drupal. I'm able to get this working, but URLs containing the subfolder also still work, so I have http://www.example.com and a clone site at http://www.example.com/drupal. What is the rule to forbid access to the subfolder? I want all URLs starting with http://www.example.com/drupal to be forbidden. This is the .htaccess in the / directory:

        Options -Indexes
        Options +FollowSymLinks

        <IfModule mod_rewrite.c>
            RewriteEngine on
            RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
            RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301]
            RewriteRule ^(.*+)$ drupal/$1 [L,QSA]
        </IfModule>

    And this is Drupal's .htaccess in the /drupal/ directory:

        Options -Indexes
        Options +FollowSymLinks
        ErrorDocument 404 index.php
        DirectoryIndex index.php index.html index.htm

        # Override PHP settings that cannot be changed at runtime. See
        # sites/default/default.settings.php and drupal_initialize_variables() in
        # includes/bootstrap.inc for settings that can be changed at runtime.

        # PHP 5, Apache 1 and 2.
        <IfModule mod_php5.c>
            php_flag magic_quotes_gpc off
            php_flag magic_quotes_sybase off
            php_flag register_globals off
            php_flag session.auto_start off
            php_value mbstring.http_input pass
            php_value mbstring.http_output pass
            php_flag mbstring.encoding_translation off
        </IfModule>

        # Requires mod_expires to be enabled.
        <IfModule mod_expires.c>
            # Enable expirations.
            ExpiresActive On
            # Cache all files for 2 weeks after access (A).
            ExpiresDefault A1209600
            <FilesMatch \.php$>
                # Do not allow PHP scripts to be cached unless they explicitly send cache
                # headers themselves. Otherwise all scripts would have to overwrite the
                # headers set by mod_expires if they want another caching behavior. This may
                # fail if an error occurs early in the bootstrap process, and it may cause
                # problems if a non-Drupal PHP file is installed in a subdirectory.
                ExpiresActive Off
            </FilesMatch>
        </IfModule>

        # Various rewrite rules.
        <IfModule mod_rewrite.c>
            RewriteEngine on

            # Block access to "hidden" directories whose names begin with a period. This
            # includes directories used by version control systems such as Subversion or
            # Git to store control files. Files whose names begin with a period, as well
            # as the control files used by CVS, are protected by the FilesMatch directive
            # above.
            RewriteRule "(^|/)\." - [F]

            # To redirect all users to access the site WITH the 'www.' prefix,
            # (http://example.com/... will be redirected to http://www.example.com/...)
            # uncomment the following:
            # RewriteCond %{HTTP_HOST} !^www\. [NC]
            # RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
            #
            # To redirect all users to access the site WITHOUT the 'www.' prefix,
            # (http://www.example.com/... will be redirected to http://example.com/...)
            # uncomment the following:
            RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
            RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301]

            RewriteBase /drupal

            # Pass all requests not referring directly to files in the filesystem to
            # index.php. Clean URLs are handled in drupal_environment_initialize().
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteCond %{REQUEST_URI} !=/favicon.ico
            #RewriteRule ^ index.php [L]
            RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]

            # Rules to correctly serve gzip compressed CSS and JS files.
            # Requires both mod_rewrite and mod_headers to be enabled.
            <IfModule mod_headers.c>
                # Serve gzip compressed CSS files if they exist and the client accepts gzip.
                RewriteCond %{HTTP:Accept-encoding} gzip
                RewriteCond %{REQUEST_FILENAME}\.gz -s
                RewriteRule ^(.*)\.css $1\.css\.gz [QSA]

                # Serve gzip compressed JS files if they exist and the client accepts gzip.
                RewriteCond %{HTTP:Accept-encoding} gzip
                RewriteCond %{REQUEST_FILENAME}\.gz -s
                RewriteRule ^(.*)\.js $1\.js\.gz [QSA]

                # Serve correct content types, and prevent mod_deflate double gzip.
                RewriteRule \.css\.gz$ - [T=text/css,E=no-gzip:1]
                RewriteRule \.js\.gz$ - [T=text/javascript,E=no-gzip:1]

                <FilesMatch "(\.js\.gz|\.css\.gz)$">
                    # Serve correct encoding type.
                    Header append Content-Encoding gzip
                    # Force proxies to cache gzipped & non-gzipped css/js files separately.
                    Header append Vary Accept-Encoding
                </FilesMatch>
            </IfModule>
        </IfModule>
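
    A hedged sketch of one common answer, added to the root .htaccess above the existing rewrite to drupal/: THE_REQUEST matches only the browser's original request line, not internal rewrites, so direct /drupal URLs can be sent back to the root without breaking the internal pass-through (use [F] instead of [L,R=301] to return 403 Forbidden rather than redirecting):

        # Redirect direct requests for /drupal/... back to the root URL.
        RewriteCond %{THE_REQUEST} ^[A-Z]+\ /drupal/ [NC]
        RewriteRule ^drupal/(.*)$ /$1 [L,R=301]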

  • Writing files in a sub folder of the web folder (apache security)

    - by Homunculus Reticulli
    I need to save session data for a dynamic web page script by writing it to a file. I have two questions. Are there any security preferences as to whether to save the data UNDER the web folder or OUTSIDE the web folder? I attempted to write to the folder and (unsurprisingly) got a 'file permission refused' type of error. Should I set the folder ownership to the Apache user (600, 640 or 644?)?

    [[Edit]]

        core  <- 'OUTSIDE' the web folder (PHP scripts live here)
        data  <- 'OUTSIDE' the web folder (session data and other misc data reside here)
        web   <- web root folder
          js  <- any folder below this point is 'INSIDE' the web folder
          css
          html

    For example, in a PHP script (i.e. a dynamic PHP page), I can attempt to write to a file using something like fput('../data', data), yet (as I understand it) ../data should not be accessible, for security reasons. Could someone please provide a simple example that shows how to provide access to ../data/ in the example given above? What are the actual SPECIFIC steps required? BTW, I am running on a LAMP stack.
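
    For reference, a minimal sketch of the usual arrangement, assuming the Apache user is www-data (the name varies by distribution) and the layout above; note that 600/640/644 are permission modes rather than ownership, and a directory additionally needs the execute bit to be entered:

        # As root: hand the data directory to the Apache user, private to it.
        chown www-data:www-data /path/to/data
        chmod 700 /path/to/data

    A PHP page under web/ could then write there with an absolute path:

        <?php
        // __DIR__ avoids depending on the current working directory.
        file_put_contents(__DIR__ . '/../data/session.txt', $data);

    Keeping the data outside the web root is generally preferred, since files there can never be fetched directly by a browser.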

  • apache domain redirect to subfolder

    - by Dennis
    I have a hosting account with GoDaddy. It's a Linux system running Apache. The way they do their setup, your primary domain is the root folder, and when you add a subdomain it goes in a subfolder of the root, which sucks. I want to set up a subfolder structure to organize my domains. I called GoDaddy support and they said to use redirects, but did not know how to do that. How it's set up now:

        primary domain: www.domain.com  ->  /
        sub.domain.com                  ->  /sub

    I want to create a directory structure and then redirect to each, but only show www.domain.com in the URL:

        www.domain.com  ->  /domain/www
        sub.domain.com  ->  /domain/sub

    I tried using:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www.)?domain.com$
        RewriteRule ^(/)?$ domain/www [L]

    but it just changes the URL to www.domain.com/domain/www. Can this be done in .htaccess?
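
    A hedged sketch of the usual internal-rewrite approach (paths as in the question; untested against GoDaddy's specific setup). The key points are rewriting without an R flag, matching every path rather than only the root, and guarding against rewrite loops:

        RewriteEngine On

        # Serve www.domain.com from /domain/www without changing the visible URL.
        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/domain/www/
        RewriteRule ^(.*)$ /domain/www/$1 [L]

        # Serve sub.domain.com from /domain/sub.
        RewriteCond %{HTTP_HOST} ^sub\.domain\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/domain/sub/
        RewriteRule ^(.*)$ /domain/sub/$1 [L]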

  • Weird .ASP pages from my non-ASP site generating 404s

    - by Amanda
    In Google Webmaster Tools, I have a huge list of "Not Found" crawl errors (404) with URLs that look like this: http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=3, seemingly originating from URLs very similar to that (e.g. http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=4). Thing is, the site is WordPress, and has been for almost a year now. It was plain HTML before that. I don't know where these ASP requests are coming from. Furthermore, the dates on which these supposed ASP pages requested the other ASP pages, resulting in 404s, are very recent. What's going on?

  • dynamic urls and links on one web page

    - by John
    I am trying to figure out how to create dynamic links and URLs on a static webpage. What I want to do is the following: I have a single webpage, for example MYWEBPAGEdotCOM/INDEX.HTML, that will always look the same except for one link on the page, for example:

        LINK TO AFFILIATE: affiliatedotCOM/my-affiliate_code_here_DYNAMIC_REFERER

    The only thing that would change is the DYNAMIC_REFERER, depending on which dynamic URL was used to reach the page:

        MYWEBPAGEdotCOM/INDEX.PHP?id=test1
        MYWEBPAGEdotCOM/INDEX.PHP?id=test2
        MYWEBPAGEdotCOM/INDEX.PHP?id=test3
        MYWEBPAGEdotCOM/INDEX.PHP?id=test4

    which would change only the dynamic link on the page to:

        affiliatedotCOM/my-affiliate_code_here_test1
        affiliatedotCOM/my-affiliate_code_here_test2
        affiliatedotCOM/my-affiliate_code_here_test3
        affiliatedotCOM/my-affiliate_code_here_test4

    Can someone tell me how I could go about doing this? I just don't want to have to make hundreds of pages, and doing it this way would prevent me from having to do so.
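
    A minimal PHP sketch of one way to do it, with the page served as index.php (the affiliate URL here is a placeholder standing in for the question's obfuscated one; the character whitelist is a precaution so arbitrary input can't be injected into the link):

        <?php
        // index.php?id=test1, ?id=test2, ... selects the referrer code.
        $id = isset($_GET['id']) ? $_GET['id'] : 'default';
        $id = preg_replace('/[^A-Za-z0-9_-]/', '', $id);
        ?>
        <html><body>
          <!-- The rest of the page is static; only this link varies. -->
          <a href="http://affiliate.example.com/my-affiliate_code_here_<?php echo $id; ?>">
            LINK TO AFFILIATE
          </a>
        </body></html>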

  • FB login and privacy policy

    - by Ispuk
    I'm building up a site where the ONLY method of logging in is the Facebook login button. Now I'm wondering: do I need to make users read and accept my own site's privacy policy in some way before they register? Is this needed? The site is for text/purpose sharing, and you can only interact with the site after you are logged in/registered, although you can navigate lists of users and purposes without doing so. Thanks

  • Blogger still visible after moving to WP; Google Indexing issues after moving from Blogger to WP

    - by Erin
    I recently migrated from Blogger to WordPress and am having two major transition issues that are really hurting. Despite literally hours of searching and experimenting, I cannot resolve the following. ISSUE ONE: I fixed all of my old Blogger links to 301 redirect successfully to my WP links (the two structures are different, which I realized too late), but my old Blogger blog is still sometimes visible! (The two designs are completely different.) I had 31 hits on my Blogger site just yesterday. I have updated my privacy settings to hide my Blogger blog from search engines and make it not visible on Blogger, and I have already removed my custom domain from Blogger as well. HELP! Not sure how to stop this. ISSUE TWO: Despite submitting a new sitemap and reindexing my pages for my WP blog, I am not visible in search engines, although I was very visible previously. In fact, some of my OLD links are showing up. Am I being penalized?? Any thoughts on how to fix this? THANK YOU! Erin. My site: www.thelawstudentswife.com

  • Apache config file. Redirect permanent gives 403 error

    - by Homunculus Reticulli
    I am changing my domain from foo.com to foobar.org. I used a Redirect permanent in my Apache config file and then restarted Apache. When I try to access the old domain foo.com, I get a 403 error. This is what my Apache config file looks like:

        <VirtualHost *:80>
            ServerName foo.com
            #ServerAlias www.foo.com
            #ServerAdmin [email protected]

            Redirect permanent / http://www.foobar.org/

            DocumentRoot /path/to/project/foo/web
            DirectoryIndex index.php

            # CustomLog with format nickname
            LogFormat "%h %l %u %t \"%r\" %>s %b" common
            CustomLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foo.access.log" common
            LogLevel notice
            ErrorLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foo.errors.log"

            <Directory />
                Order Deny,Allow
                Deny from all
            </Directory>

            <Files ~ "^\.ht">
                Order allow,deny
                Deny from all
            </Files>

            <Directory /path/to/project/foo/web>
                Options -Indexes -Includes
                AllowOverride All
                Allow from All

                RewriteEngine On
                # We check if the .html version is here (caching)
                RewriteRule ^$ index.html [QSA]
                RewriteRule ^([^.])$ $1.html [QSA]
                RewriteCond %{REQUEST_FILENAME} !-f
                # No, so we redirect to our front end controller
                RewriteRule ^(.*)$ index.php [QSA,L]
            </Directory>

            <Directory /path/to/project/foo/web/uploads>
                Options -ExecCGI -FollowSymLinks -Indexes -Includes
                AllowOverride None
                php_flag engine off
            </Directory>

            Alias /sf /lib/vendor/symfony/symfony-1.3.8/data/web/sf
            <Directory /lib/vendor/symfony/symfony-1.3.8/data/web/sf>
            # Alias /sf /lib/vendor/symfony/symfony-1.4.19/data/web/sf
            # <Directory /lib/vendor/symfony/symfony-1.4.19/data/web/sf>
                Options -Indexes -Includes
                AllowOverride All
                Allow from All
            </Directory>
        </VirtualHost>

    Can anyone spot what I may be doing wrong? The site foobar.org does exist, so I don't know why this error occurs. Help?

  • Huge google impression drop after cleaning html

    - by olgatorresfoundation
    Good morning, I am the webmaster of a non-profit organization that awards grants to colorectal cancer research projects and funds various colorectal cancer information campaigns. We have three domains: www.fundacioolgatorres dot org (Catalan), www.fundacionolgatorres dot org (Spanish) and www.olgatorresfoundation dot org (English). So what happened? I redesigned olgatorresfoundation on the 20th and fundacionolgatorres on the 30th of May. In both cases, exactly two days later, the number of impressions on both dropped to almost nothing. Granted, we did not have the traffic of Microsoft, but a 90% decrease is a disaster of incredible proportions for us. My only real changes were cleaning up the old ineffective HTML into a cleaner form (mostly moving away from redundant table construction to a table-less layout). Here is a before-and-after snapshot of what the change looks like: Before: http://www.fundacioolgatorres.org/aparell_digestiu/introduccio/ (unchanged page, in Catalan). After: http://www.olgatorresfoundation.org/digestive_system/introduction/ (changed page, in English). Does anybody have a clue to what just happened? Why should a normal, sane HTML improvement be punished, and so dramatically? No URLs have been changed, and neither have page names or descriptions. Possible secondary question: if Google sees it as a major overhaul and decides to drop the PageRank sharply, does it come back to pre-change levels once the content "checks out", or does the page start over from scratch earning those PageRank points (which would mean we would have to wait 6 months for the pages to recover to the level they had two weeks ago)? (Duplicated from productforums.google dot com/forum/#!category-topic/webmasters/crawling-indexing--ranking/YsnyX0JzOpY, hoping to reach a wider audience.)

  • Displaying topics/categories related ads

    - by Frank
    Is there any advertising company that allows webmasters to decide on general ad topics (such as "Entertainment", "Autos and Vehicles", "Arts", etc.) to be displayed to users visiting a website? I know you can choose specific ad campaigns to show to users, or let the advertising company decide for you which ads to show. But I am looking for an option to ask the advertising company to show ads based on categories/topics. Thanks. Frank

  • Getting PHP to work with apache to run .php files through browser [closed]

    - by Kevin Duke
    I have a VPS running Debian 5.0 (I think) and I would like to get it to run PHP files. I was told it needed to be configured with Apache, so I tried entering the command apt-get install apache2 php5 libapache2-mod-php5, but there was no change. Console output: http://pastebin.com/sVMWq6mA. This is everything in my /etc/apache2/mods-enabled: http://img35.imageshack.us/img35/6474/modsb.jpg. My webserver can be accessed here: http://206.217.223.136/test/. In my test.php file I have the code <?php phpinfo(); ?>, but instead of displaying the page, the server tries to download it. How can I fix this?
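
    A hedged sketch of the usual fix on Debian, assuming Apache itself runs fine and only the PHP module is missing or disabled (package names are from the PHP 5 era the poster is on):

        # Install the Apache PHP module (also pulls in its handler config).
        apt-get install libapache2-mod-php5

        # Enable the module and restart Apache so .php files are executed
        # by the interpreter instead of being offered for download.
        a2enmod php5
        /etc/init.d/apache2 restart

    If apt reports the package is already installed, the module may simply be disabled; a2enmod php5 followed by the restart covers that case.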

  • Sending Emails via Google SMTP - after some time quit working

    - by Chris
    On a website I use PHPMailer to send automated registration emails etc., plus a newsletter tool (which loops through the addresses and sends the emails one by one). Also, in Gmail under Settings I added and confirmed @mydomain addresses, so I can send from @mydomain addresses without the Gmail address being displayed. Furthermore, I authorized the website to send mail via this link: https://accounts.google.com/DisplayUnlockCaptcha. Now, after two months in which everything worked perfectly fine, users suddenly stopped receiving emails, and most recently emails are not even being sent anymore. I have also received many error messages like this:

        Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the recipient domain. We recommend contacting the other email provider for further information about the cause of this error. The error that the other server returned was: 550 550 5.4.1 [email protected]: Recipient address rejected: Access Denied (state 13).

    When I check with this tool: https://toolbox.googleapps.com/apps/checkmx/ it reports two non-critical issues: "Relayhost configuration detected" and "There SHOULD be a valid SPF record". So, the questions I have are: does anybody have any hint why it stopped working and what the error messages mean? What can I do to fix it? Where do I set an SPF record (cPanel?)? What is a relayhost and how do I fix that? It is about 1000-1400 mails a day (Gmail's limit is 2000). Also, what can go wrong when setting up an SPF record? I've heard there are some testing tools for that. Thank you so much in advance for your help!
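
    On the SPF question: an SPF record is a DNS TXT record on the sending domain; in cPanel it is typically added under the DNS Zone Editor, or by the host if DNS is managed elsewhere. A minimal sketch for a domain that relays through Google's servers (mydomain.com stands in for the poster's domain):

        mydomain.com.  IN  TXT  "v=spf1 include:_spf.google.com ~all"

    The include pulls in Google's sending ranges, and ~all soft-fails everything else; the common mistake is listing a mechanism that omits a server the domain actually sends from, which is what the testing tools check for.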

  • Need Help With Finding SEO Company/Individual

    - by three3
    Hi everyone, I am fairly new to SEO, and I have applied every tactic I know to help my site rank in the number 1 spot on Google. I know that no one can guarantee the number 1 spot on Google or any other search engine, but I cannot even get my website onto the first page of Google's search results. My company is looking to hire a company or individual to work on our SEO. Does anyone here know of an SEO company or individual with a track record of getting websites to the front page, and preferably the number 1 spot? We are willing to pay a large sum of money for our keywords to rank at the top of the search results. Any suggestions are welcome. Thanks, John

  • Why my public ip address is different when I visit different website? [closed]

    - by Mett Li
    After I connect to the web via PPPoE, I visit different websites' phpinfo() pages, but the result for _SERVER["REMOTE_PORT"] is different on each. IP address lookup sites like www.whatismyip.com, www.apnic.net, etc., and Gmail's IP location lookup list, all show different addresses as well. Why? Are these different IP addresses allocated by the ISP for different web resources? Maybe the reason is a CDN, either my ISP's or the websites'? If the ISP really does allocate different IP addresses for different web resources, then the visitor IP address a website sees may not be the real one.

  • Transfer websites and domains to new server

    - by Albert
    We currently have around 40 websites and 80+ domains/sub-domains in a shared 1&1 hosting package, and we just acquired a managed dedicated server with 1&1 as well. Now it's time to start transferring everything over to the new server. Transferring just the websites and databases wouldn't be a problem; it would take time, but it's pretty straightforward. The problem comes when transferring the domains. Let me explain why. Many of the websites we have are accessible via sub-domains of a parent domain. Ideally, we would like to transfer the sites one by one, in order to check for each one that everything works fine on the new server. However, since we also need to transfer the domain so it's managed on the new server, all the websites using that domain need to be already on the new server before transferring that domain, which rules out the "one by one" approach. Another issue is the downtime when transferring the domain, from the moment it stops working in the hosting package until it becomes active on the new server; I believe there's nothing we can do about that. So my question is whether there's any way we can do the "one by one" transfer of the websites (and their corresponding sub-domains) under the circumstances described above. One idea I had would be:

        1. Let's say we have website A, which is accessible using subdomain.mydomain.com (and there are many other websites accessible via other sub-domains of mydomain.com).
        2. Transfer the files of website A to the new server.
        3. Point a test domain on the new server to website A's folder (the new server comes with a "test" domain).
        4. Test whether website A works with that "test" domain.
        5. In the old hosting, somehow point the real sub-domain (subdomain.mydomain.com) to the new location of website A, in a way that users always see the same URL as before.
        6. Repeat 2-5 for every website belonging to the same domain.
        7. Once all are working on the new server, do the actual transfer of the domain to the new server, and then re-create all the sub-domains and point them to their corresponding websites.

    That way, users wouldn't notice that there's been a change (except for a small downtime of the websites when doing the domain transfer). The part I'm not sure about is point 5 above; see the sketch after this question. Is there any way to do that? I mean, do it in a way that users see the original domain all the time in their browser, even for internal pages (so not only for the "home page", which would be sub-domain.mydomain.com, but also, for example, for the contact page, which would be sub-domain.mydomain.com/contact.php). Is there any way to do this? Or are we SOL and going to have to transfer everything at the same time?
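
    On point 5, the standard mechanism is a reverse proxy on the old host: it fetches pages from the new server while the browser keeps showing the original sub-domain, for internal pages as well as the home page. A hedged sketch, assuming the old host permits mod_proxy directives (shared hosts frequently do not, in which case this option is off the table; test.newserver.example is a hypothetical address for website A on the new machine):

        # In the old server's vhost for subdomain.mydomain.com: forward every
        # request to the new server and rewrite response headers back, so the
        # visible URL never changes.
        ProxyPass        / http://test.newserver.example/
        ProxyPassReverse / http://test.newserver.example/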

  • Showing schema.org aggregate rating in Google rich snippets

    - by nickh
    After adding schema.org microdata markup for reviews and aggregate ratings, I expected review and rating information to show up in rich snippets. Unfortunately, neither is being shown. Google's Structured Data Testing Tool finds the microdata, and there are no errors or warnings on the page. Any idea what's wrong with the microdata markup?

        Example 1:
        Live Page: http://www.shelflife.net/ljn-thundercats-series-3/bengali
        Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fljn-thundercats-series-3%2Fbengali

        Example 2:
        Live Page: http://www.shelflife.net/star-wars-mighty-muggs/asajj-ventress
        Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fstar-wars-mighty-muggs%2Fasajj-ventress
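
    For comparison, a minimal sketch of the AggregateRating shape Google's rich snippets help describes, nested inside the item being rated (the product name and numbers here are invented placeholders, not taken from the poster's pages):

        <div itemscope itemtype="http://schema.org/Product">
          <span itemprop="name">LJN Thundercats Bengali</span>
          <div itemprop="aggregateRating" itemscope
               itemtype="http://schema.org/AggregateRating">
            Rated <span itemprop="ratingValue">4.5</span>/5
            based on <span itemprop="reviewCount">11</span> reviews
          </div>
        </div>

    A common cause of "valid markup, no snippet" is the rating not being nested inside (or referenced from) the reviewed item, though Google also simply declines to show snippets for some sites.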

  • Web application stopped behaving normally after migration from dedicated servers to EC2 servers

    - by sunny
    Our web application stopped behaving normally after migration from a dedicated server to EC2 servers.

    Old dedicated server configuration:

        System type: 32-bit operating system
        RAM: 3.99 GB
        Processor: 1.86 GHz

    New servers in EC2:

        System type: 64-bit operating system
        RAM: 3.99 GB
        Processor: 2.73 GHz - 2.31 GHz

    Everything was working fine on our old production server, but once we migrated our web application to the new servers and moved all the network traffic over to them, the site suddenly started behaving abnormally. Sometimes it's super fast, sometimes slow, sometimes normal, sometimes super slow, and sometimes there's no response at all. This cycles at intervals of around 2-3 minutes, and went on for 8-10 hours. A few differences between the old and new servers: the old servers use IIS 6 and the new servers use IIS 7.5. We are using exactly the same code on both. The EC2 servers even have a faster CPU than the older servers, yet performance is lower; not sure how this happened. Please suggest your views...

  • What bots are really worth letting onto a site?

    - by blunders
    Having written a number of bots, and seen the massive amount of random bots that happen to crawl a site, I am wondering: if the goal of allowing bots onto a site is the potential for them to send real traffic back, is there any reason to allow bots that are not known to send real traffic back? And how do you spot these "good" bots, based on how they identify themselves, the IPs they come from, their behavior, etc.?
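
    For illustration, a hedged robots.txt sketch of the allowlist approach this implies (the user-agent strings are the real ones for Google's and Bing's crawlers; which bots belong on the list is exactly the judgment call being asked about, and badly behaved bots ignore robots.txt anyway, so IP and behavior checks remain necessary):

        # Allow the crawlers known to send traffic back...
        User-agent: Googlebot
        Disallow:

        User-agent: Bingbot
        Disallow:

        # ...and ask everything else to stay out.
        User-agent: *
        Disallow: /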

  • Fixed JavaScript Warning - Pin to Top of Page Using CSS Position [migrated]

    - by nicorellius
    I am new to this site, but it seems like the right place to ask this question. I am working on a noscript chunk of code whereby I display, among other things, a <p> at the top of the page that alerts users that they have JavaScript disabled. The end result should look like the Stack Exchange sites do when JavaScript is disabled (here is a screenshot of mine; SE looks similar except the bar is at the very top of the page). I have it working OK, but I would love it if the red bar stayed fixed along the top when scrolling. I tried using the position: fixed; method, but it ends up moving the p element, and I can't get it to look exactly the same as it does without position: fixed;. I tried fiddling with CSS top and left and other positioning, but it never looks the way I want. Here is a CSS snippet:

        <noscript>
            <style type="text/css">
                p.noscript_warning { position: fixed; }
            </style>
        </noscript>
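
    A hedged sketch of the styling that usually produces the SE-style pinned bar (the class name is from the question; the colors are placeholders):

        p.noscript_warning {
            position: fixed;   /* stay put while the page scrolls */
            top: 0;            /* pin to the top edge... */
            left: 0;           /* ...and the left edge */
            width: 100%;       /* span the full viewport width */
            margin: 0;         /* fixed elements keep their margins; zero them */
            z-index: 1000;     /* sit above the page content */
            background: #c00;
            color: #fff;
            text-align: center;
        }

    The top/left/width trio is what was missing: with position: fixed alone, the element is positioned relative to the viewport but keeps its original offsets, which is why it appeared to move.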

  • Host And Expose Application to local small network

    - by tartak
    I developed a little application (web application) using Java EE + MySQL. I use it to keep some data and, from time to time, to get some reports on that data. My problem is that I have to access this application from 4-5 computers in the office. They are connected through a switch; it's a typical small office network, nothing fancy. I need some advice on how to do this. I mean, for a small application with no external communication, is it mandatory to put Apache in front? I'd use a simple Tomcat container on the "server machine" (which is my computer, a Windows machine) and basically permit access to my colleagues as well. I don't have any knowledge about concurrency (I know MySQL permits concurrent access), so I would like some configuration tips also.
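
    For what it's worth, a plain Tomcat is generally enough here: by default its HTTP connector listens on all interfaces, so colleagues can reach the app at http://<the server machine's LAN IP>:8080/appname once Windows Firewall allows the port. A hedged sketch of the relevant element in conf/server.xml (these are the stock values; an address attribute would only be needed to restrict which interface it binds to):

        <!-- The default HTTP connector already accepts LAN connections. -->
        <Connector port="8080" protocol="HTTP/1.1"
                   connectionTimeout="20000"
                   redirectPort="8443" />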

  • XAMPP - Unable to serve files larger than ~30MB [on hold]

    - by Sparx401
    I'm developing a site locally with XAMPP on Windows 7, and as far as media is concerned, I'm unable to play media files larger than 30 MB or so. Both video and audio files (MP4 and MP3 respectively) generate this error in Chrome (and similar errors in other browsers such as IE9 and Opera):

        No data received. Unable to load the webpage because the server sent no data.
        Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.

    The exact number of MB seems to vary somewhat between browsers, though; one video in question is 34 MB and actually plays in Opera and IE9, but gives the aforementioned error in Chrome. I've checked that the file paths are typed correctly and ensured that the directive to serve MP4s is present in .htaccess:

        AddType video/mp4 mp4

    Also, I have these directives set in the same .htaccess file:

        php_value upload_max_filesize "80M"
        php_value post_max_size "80M"
        php_value max_input_time 60
        php_value max_execution_time 60

    And memory_limit is set in php.ini to "128M", so I'm left wondering: what is causing my files not to play, and which directives, if any, do I have to change on the server side? Perhaps something to do with limitations of the GET method (the method I'm seeing on Chrome's network tab among the other header request/response info)?
