Search Results

Search found 2454 results on 99 pages for 'domains'.


  • setting a cookie in php

    - by Jacksta
    I am trying to set a cookie. What's wrong with this? I am getting an error:

      Warning: setcookie() expects parameter 3 to be long, string given in /home/admin/domains/domain.com.au/public_html/setcookie.php on line 6

      <?php
      $cookie_name = "test_cookie";
      $cookie_value = "test_string";
      $cookie_expire = "time()+86400";
      $cookie_domain = "localhost";
      setcookie($cookie_name, $cookis_value, $cookie_expire, "/", $cookie_domain, 0);
      ?>
      <HTM>
      <HEAD>
      </HEAD>
      <BODY>
      <h1>cookie mmmmmmm</h1>
      </BODY>
      </HTML>
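    For reference, a minimal sketch of a call that avoids the "expects parameter 3 to be long" warning, assuming the same variable names as in the question (the expiry is passed as an unquoted integer expression, the value variable name matches, and setcookie() runs before any HTML output):

      <?php
      // Hedged sketch, not necessarily the only fix needed.
      $cookie_name   = "test_cookie";
      $cookie_value  = "test_string";
      $cookie_expire = time() + 86400;   // integer, so parameter 3 is a long
      $cookie_domain = "localhost";      // assumption: still testing on localhost
      setcookie($cookie_name, $cookie_value, $cookie_expire, "/", $cookie_domain, 0);
      ?>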

    Read the article

  • Tricking the server to load files faster?

    - by Yongho
    If we have a website with multiple images and videos, I've read that it's best to serve them from other domains so that the browser can download a bunch of files simultaneously, rather than waiting for each file to be downloaded one by one. For example, if we have a website http://example.com/, we might consider serving:
      Videos from http://video.example.com/
      Images from http://images.example.com/
      etc.
    Question: can we achieve the simultaneous downloading by tricking the browser into believing that the files are hosted there, or do they actually need to be at that location? We can, for example, pretend to serve video from http://video.example.com/ when it's really just a clever htaccess rewrite that actually serves from http://example.com/video.php. In this case the video is being served from the main domain, but because we refer to it as http://video.example.com/, the browser may think it's another domain and thus load files simultaneously rather than one by one. Is this feasible?
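    The split only depends on the URLs the page emits, so one way to sketch the idea (domain sharding) in PHP is a helper that assigns each asset path to a fixed host. The host names below are taken from the question and are assumed to point back at the same server via DNS or rewrites:

      <?php
      // Sketch only: pick a stable host per asset so the browser opens
      // parallel connections while caching stays consistent.
      function asset_url($path, array $hosts = array('images.example.com', 'video.example.com'))
      {
          $host = $hosts[abs(crc32($path)) % count($hosts)];   // stable choice per file
          return 'http://' . $host . '/' . ltrim($path, '/');
      }

      echo asset_url('img/logo.png');    // e.g. http://images.example.com/img/logo.png
      echo asset_url('clips/intro.mp4'); // e.g. http://video.example.com/clips/intro.mp4
      ?>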

    Read the article

  • RST packet sent by the server

    - by intoTHEwild
    I am developing a client in Flash and using HTTP request/response to communicate with the server. For a while the session works fine, and then the connection is terminated by the server. I did a Wireshark sniff at the server, and the last message it sends is an RST packet. It happens only when I'm using IE and the server and client are in different domains; it does not happen in Firefox. I had been struggling to find a solution until I found this thread. It's a bit old, but I hope I can get some help. I am not sure if this bit of info is important, but I am connecting to the server via a gateway. Any clues or suggestions on where I should look to locate the problem?

    Read the article

  • How to Redirect Subdomains to Other Domain

    - by Codex73
    What I'm trying to accomplish with an htaccess mod_rewrite rule: redirect all sub-domains to the new domain name, preserving the sub-domain. E.g.

      test1.olddomain.com === test1.newdomain.com
      test2.olddomain.com === test2.newdomain.com
      test3.olddomain.com === test3.newdomain.com

    This is what I have so far, which of course is wrong:

      Options +FollowSymLinks
      RewriteEngine on
      RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
      RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
      RewriteCond %{HTTP_HOST} ^www\.olddomain\.com$ [NC]
      RewriteRule ^(.*) http://www.newdomain.com/$1 [R=301,L]
      RewriteRule [a-zA-Z]+\.olddomain.com$ http://$1.newdomain.com/ [R=301,L]

    Since I'm not a regular-expression junkie just yet, I need your help; thanks for anything you can offer. I also know the first two condition/rule pairs can be combined into one. Note: the reason I don't redirect the whole domain via DNS is that a lot of directories need special rewrite rules in order to maintain their SEO positions.
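    The question is about mod_rewrite, but purely to spell out the host-mapping logic, here is a hypothetical PHP version of the same redirect (an alternative approach, not the asker's .htaccess; it assumes every sub-domain of olddomain.com already resolves to the server running this script):

      <?php
      // Carry the sub-domain over from olddomain.com to newdomain.com with a 301.
      $host = $_SERVER['HTTP_HOST'];   // e.g. test1.olddomain.com
      if (preg_match('/^(?:([a-z0-9-]+)\.)?olddomain\.com$/i', $host, $m)) {
          $sub = (!empty($m[1]) && strtolower($m[1]) !== 'www') ? $m[1] . '.' : 'www.';
          header('HTTP/1.1 301 Moved Permanently');
          header('Location: http://' . $sub . 'newdomain.com' . $_SERVER['REQUEST_URI']);
          exit;
      }
      ?>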

    Read the article

  • Map a domain to an MVC area

    - by Simon_Weaver
    Anybody got any experience in mapping a domain to an MVC area? Here's our situation.

    Old system (still active, but will soon redirect to the new store):
      www.example.com - our main site, where we send traffic
      store.example.com - our store site, a completely separate site that is indexed in Google

    New system:
      www.example.com - same site as before
      www.example.com/store - new store site, built as an ASP.NET MVC area

    Because the store is a separate domain, Google gives it a separate entry in the search results. I'd like to keep this benefit in future, but I'm wondering whether there is a good way to map a domain (store.example.com) to the MVC area, or if it's just going to be more trouble than it's worth. PS: I'm not trying to keep the existing indexing; it's a completely separate store, so that's not possible. I just want to redirect to the corresponding page in the new store. I'm just trying not to lose the benefit of two domains for SEO purposes.

    Read the article

  • How to make Universal Feed Parser only parse feeds?

    - by piquadrat
    I'm trying to get content from external feeds on my Django web site with Universal Feed Parser. I want to have some user error handling, e.g. for when the user supplies a URL that is not a feed. When I tested how feedparser responds to faulty input, I was surprised to see that it does not throw any exceptions at all. E.g. on HTML content it tries to parse some information from the HTML code, and on non-existing domains it returns a mostly empty dictionary:

      {'bozo': 1, 'bozo_exception': URLError(gaierror(-2, 'Name or service not known'),), 'encoding': 'utf-8', 'entries': [], 'feed': {}, 'version': None}

    Other kinds of faulty input manifest themselves in the status_code or namespaces values of the returned dictionary. So, what's the best approach to sane error checking without resorting to an endless cascade of if .. elif .. elif ...?

    Read the article

  • How to organise a PHP-based website

    - by bsandrabr
    I am putting my PHP/MySQL website up, and this is my scenario: the users are grouped into sites, each site with its own unique database. There will be about 40 users per site. The two options I'm trying to decide between are:
      1. have a central website running the PHP and directing the users off to their own database, or
      2. use sub-domains for each user, each with their own PHP in htdocs.
    I don't even know if option 2 is possible (or just stupid), but if it were, would it make any difference to performance, given that they're all being run by the same server? Any other ideas or advice would be much appreciated, as I want to organise this the best way from the start.
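    If option 1 wins, a minimal sketch of what the shared front controller could do is below. It assumes each site is identified by its sub-domain and that the databases follow a "site_<name>" naming convention; both are assumptions, not part of the question:

      <?php
      // One codebase, one vhost; the database is picked from the sub-domain.
      $host = $_SERVER['HTTP_HOST'];                      // e.g. siteone.example.com
      $site = preg_replace('/[^a-z0-9_]/i', '', strtok($host, '.'));

      $link = mysql_connect('localhost', 'dbuser', 'dbpass') or die(mysql_error());
      mysql_select_db('site_' . $site, $link) or die(mysql_error());
      // ...the rest of the application code stays identical for every site
      ?>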

    Read the article

  • Parsing html for domain links

    - by Hallik
    I have a script that parses an HTML page for all the links within it. I am getting all of them fine, but I have a list of domains I want to compare them against. A sample list contains:

      list = ['www.domain.com', 'sub.domain.com']

    But I may have a list of links that look like:

      http://domain.com
      http://sub.domain.com/some/other/page

    I can strip off the http:// just fine, but both of the example links above should match: the first against www.domain.com, and the second against the sub-domain entry in the list. Right now I am using urllib2 for parsing the HTML. What are my options for completing this task?

    Read the article

  • Elements are listed in a VOB but cannot be checked out/checked in via CCRC

    - by sunil devan
    Hi, there are two Windows domains, named OPR and BDC. The CCRC server is hosted in the OPR domain; users accessing it from the BDC domain are able to connect to CCRC, list the VOBs, and join the project. However, any checkout, checkin or loading of resources takes a very long time, and even after a day it is in the same state. Connectivity from the BDC domain to the OPR domain is fine (ping and tracert are working fine). Could you please let me know if you have any idea about this? Thanks, Sunil

    Read the article

  • Finding the loading performance of the website

    - by pandora
    How do I find the site's performance? There are tools like YSlow and Speed Tracker in Google that show the speed of a website. I have built an LMS project in PHP with Zend Framework, and everything is live. When a user posts content for a subject (which may be around 200K in size), the submission to the server is very slow, and sometimes the server goes down. I logged in to the server (PuTTY) and checked; I found that a lot of resources were occupied on my server, and it was using the server's full memory. When I freed the resources, the site loaded well. The site is on a dedicated server with 4GB RAM, alongside 3 more domains. Because of this LMS website, all the websites go down. I need to check what is wrong with my website. How do I start?
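    Before reaching for front-end tools like YSlow, it may help to see which requests are actually slow or memory-hungry on the server. A rough sketch (assumptions: PHP 5.3+ for the closure, you can edit the Zend Framework front controller, usually public/index.php, and /tmp is writable):

      <?php
      // Log URI, wall time and peak memory for every request so the heavy ones stand out.
      $profileStart = microtime(true);
      register_shutdown_function(function () use ($profileStart) {
          $line = sprintf("%s %s %.3fs %.1fMB\n",
              date('c'),
              isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'cli',
              microtime(true) - $profileStart,
              memory_get_peak_usage(true) / 1048576);
          file_put_contents('/tmp/lms-profile.log', $line, FILE_APPEND);
      });
      // ...the normal Zend_Application bootstrap continues below, unchanged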

    Read the article

  • Why does my program occasionally segfault when out of memory rather than throwing std::bad_alloc?

    - by Bradford Larsen
    I have a program that implements several heuristic search algorithms and several domains, designed to experimentally evaluate the various algorithms. The program is written in C++, built using the GNU toolchain, and run on a 64-bit Ubuntu system. When I run my experiments, I use bash's ulimit command to limit the amount of virtual memory the process can use, so that my test system does not start swapping. Certain algorithm/test instance combinations hit the memory limit I have defined. Most of the time, the program throws an std::bad_alloc exception, which is printed by the default handler, at which point the program terminates. Occasionally, rather than this happening, the program simply segfaults. Why does my program occasionally segfault when out of memory, rather than reporting an unhandled std::bad_alloc and terminating?

    Read the article

  • How to gather usage statistics for iPhone app?

    - by FX
    I am in the process of releasing my first iPhone app. It's a simple utility; I'd just like to gauge the release process, app lifetime and trends, so it can help me make more realistic choices in future apps. I think it would be nice to have usage statistics in addition to the download stats from Apple: for example, how many times the app is opened by each user, what iPhone OS version they have, etc. I think some of it could simply be done by connecting to a known URL on one of my domains and passing it anonymous information (let's say, connecting to http://mydomain.net/stats?app=myApp&version=1.0.0&os=3.1.2&used=18). My questions are:
      1. Is that forbidden in any way by Apple's rules? (None that I could find, at least.)
      2. Does that seem reasonable to you?
      3. Are there existing frameworks that would do this simpler/better than writing my own code?
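    The URL in the question implies a tiny receiver on the server side; a sketch of what that could look like in PHP is below (the log file path is an assumption, and the field list mirrors the example query string):

      <?php
      // Minimal receiver for http://mydomain.net/stats?app=...&version=...&os=...&used=...
      // Appends one CSV line per ping; no personal data is stored.
      $fields = array('app', 'version', 'os', 'used');
      $row    = array(date('c'));
      foreach ($fields as $f) {
          // strip anything unexpected so the log stays clean
          $row[] = isset($_GET[$f]) ? preg_replace('/[^A-Za-z0-9._-]/', '', $_GET[$f]) : '';
      }
      file_put_contents('/var/log/app-stats.csv', implode(',', $row) . "\n", FILE_APPEND);
      header('Content-Type: text/plain');
      echo 'ok';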

    Read the article

  • 2 sites each in a different country with 1 set of content (cloaking)

    - by Greg
    Hi, I have a question re: cloaking. I have a friend who has a business in Canada and the UK. Currently the .ca site is hosted on GoDaddy. The .co.uk domain is registered with Domainmonster (with a UK IP address) and uses a cloaked/framed redirect to the .ca site. As a result (my assumption), the .ca site is indexed fine by Google, but the .co.uk is not. The content is generic for both sites. How do I point the .co.uk site directly to the content independently (preferably without duplicating the content hosting in the UK), so that, for instance, if the .ca domain were taken away altogether, the .co.uk domain would remain an entity in itself from Google's point of view? Does Google index a generic set of content and then associate different country domains with that content? I hope I have explained this OK. Thanks, Greg

    Read the article

  • accessing and modifying tab opened using window.open in google chrome

    - by sonofdelphi
    I used to be able to do this to create an exported HTML page containing some data, but the code is not working with the latest version of Google Chrome (it works fine with Chrome 5.0.307.11 beta and all other major browsers):

      function createExport(text) {
          var target = window.open();
          target.title = 'Memonaut - Exported View';
          target.document.open();
          target.document.write(text);
          target.document.close();
      }

    Chrome now complains that the domains don't match and disallows the JavaScript calls as unsafe. How can I access and modify the document of a newly opened browser tab in such a scenario?

    Read the article

  • Is it possible to prevent a locally-running SWF (AS3) from downloading from my website?

    - by Matt
    I've got a crossdomain.xml file which allows SWFs running on only a certain few domains to download resources from my domain. However, one simple way around this is for a user to download the SWF to their local machine and run it there (i.e. by double-clicking on it within Windows Explorer, not by running it through http://localhost). It seems that when this happens, the crossdomain.xml file is ignored. I understand that in my ActionScript I can do this:

      if (Security.sandboxType.indexOf(Security.REMOTE) == -1) // running locally - don't allow

    However, it is incredibly easy for someone to decompile the SWF and simply remove this line. Is it possible to do something on the server side to stop a locally running SWF from downloading from my site? I tried checking the referrer, but this field often isn't populated. Does anyone have any other ideas? Thanks, Matt
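    One possible server-side idea (a sketch under assumptions, not a complete answer): have the page that embeds the SWF, which is only served on the allowed domains, pass a short-lived signed token to the SWF via flashvars, and require that token on every resource request. A locally-run SWF never receives a fresh token. The shared secret and token format here are hypothetical:

      <?php
      // Expects ?token=<md5(secret . expiry)>:<expiry> on each resource request.
      $secret = 'change-me';                               // hypothetical shared secret
      $token  = isset($_GET['token']) ? $_GET['token'] : ':0';
      list($sig, $expires) = explode(':', $token, 2) + array('', 0);
      if ((int)$expires < time() || $sig !== md5($secret . $expires)) {
          header('HTTP/1.1 403 Forbidden');
          exit('missing or expired token');
      }
      // ...stream the requested resource here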

    Read the article

  • Domain migration - 301 redirect of all contents of a directory

    - by Trufa
    Hi, I would like to know if it is possible to do the following, considering that I would like to migrate domains. I have, let's say:

      one.com/files/one.html
      one.com/files/two.php
      one.com/other/three.html
      one.com/other/four.doc
      one.com/other/subdirectory/five.doc

    I am migrating to two.com, so I would like to make RESPECTIVE 301 redirects to the following:

      two.com/old/files/one.html
      two.com/old/files/two.php
      two.com/old/other/three.html
      two.com/old/other/four.doc
      two.com/old/other/subdirectory/five.doc

    I've tried with cPanel, and although I come "close" with the redirects option, I can't seem to make it happen. There aren't many folders (10-12), but there are a lot of files, so doing it manually is obviously impossible. How would you proceed? Can this, or should this, be done with a regex in .htaccess? Can you redirect all the elements of a subdirectory in the manner expressed above? I hope the question is clear enough; if not, please ask for any clarification needed. Thanks in advance!

    Read the article

  • DNS resolution problems; dig SERVFAIL error

    - by JustinP
    I'm setting up a couple of dedicated servers and having problems setting up my nameservers properly. One of these is a LEMP server (LAMP with nginx in place of Apache); the other will function solely as an email server, running exim/dovecot/ASSP antispam (no Apache). The LEMP server is CentOS 5.5 with no control panel, while the email server is CentOS 5.5 as well, with cPanel/WHM. So, I've had problems getting DNS set up properly. I have two domains, each one pointing to one of these servers. The nameservers are registered correctly with the domain registrar, and the nameserver IPs are entered correctly as well. I've spoken to tech support at the registrar and they confirm that everything is set up on their end. Not knowing much about DNS, I googled nameservers and DNS until I nearly went blind, and spent hours messing with the configuration. Eventually, I got the LEMP server's DNS working properly (no cPanel). Pleased with this triumph, I'm trying to mimic that configuration and repeat the process with the email server, and it's just not happening. The nameserver starts and stops, but the domain doesn't resolve.

    Things I have tried:
      - Going through standard procedures to set up DNS in WHM
      - Clearing all DNS information, uninstalling BIND, then reinstalling all of that and again going through the WHM procedures for setting up DNS
      - Clearing all DNS information, and setting up BIND via shell (completely outside of cPanel) by using my config and zone files from the LEMP server as a template

    named runs just fine, but nothing is resolving. When I "dig any example.com" I get a SERVFAIL message. Nslookups return no information. Here are my config and zone files.

    named.conf

      controls {
          inet 127.0.0.1 allow { localhost; } keys { coretext-key; };
      };
      options {
          listen-on port 53 { any; };
          listen-on-v6 port 53 { ::1; };
          directory "/var/named";
          dump-file "/var/named/data/cache_dump.db";
          statistics-file "/var/named/data/named_stats.txt";
          memstatistics-file "/var/named/data/named_mem_stats.txt";
          // Those options should be used carefully because they disable port
          // randomization
          // query-source port 53;
          // query-source-v6 port 53;
          allow-query { any; };
          allow-query-cache { any; };
      };
      logging {
          channel default_debug {
              file "data/named.run";
              severity dynamic;
          };
      };
      view "localhost_resolver" {
          match-clients { 127.0.0.0/24; };
          match-destinations { localhost; };
          recursion yes;
          //zone "." IN {
          //    type hint;
          //    file "/var/named/named.ca";
          //};
          include "/etc/named.rfc1912.zones";
      };
      view "internal" {
          /* This view will contain zones you want to serve only to "internal" clients
             that connect via your directly attached LAN interfaces - "localnets". */
          match-clients { localnets; };
          match-destinations { localnets; };
          recursion yes;
          zone "." IN {
              type hint;
              file "/var/named/named.ca";
          };
          // include "/var/named/named.rfc1912.zones";
          // you should not serve your rfc1912 names to non-localhost clients.
          // These are your "authoritative" internal zones, and would probably
          // also be included in the "localhost_resolver" view above:
          zone "example.com" {
              type master;
              file "data/db.example.com";
          };
          zone "3.2.1.in-addr.arpa" {
              type master;
              file "data/db.1.2.3";
          };
      };
      view "external" {
          /* This view will contain zones you want to serve only to "external" clients
           * that have addresses that are not on your directly attached LAN interface subnets: */
          match-clients { any; };
          match-destinations { any; };
          recursion no;
          // you'd probably want to deny recursion to external clients, so you don't
          // end up providing free DNS service to all takers
          allow-query-cache { none; };
          // Disable lookups for any cached data and root hints
          // all views must contain the root hints zone:
          //include "/etc/named.rfc1912.zones";
          zone "." IN {
              type hint;
              file "/var/named/named.ca";
          };
          zone "example.com" {
              type master;
              file "data/db.example.com";
          };
          zone "3.2.1.in-addr.arpa" {
              type master;
              file "data/db.1.2.3";
          };
      };
      include "/etc/rndc.key";

    db.example.com

      $TTL 1D
      ;
      ; Zone file for example.com
      ;
      ; Mandatory minimum for a working domain
      ;
      @       IN SOA  ns1.example.com. contact.example.com. (
                      2011042905 ; serial
                      8H         ; refresh
                      2H         ; retry
                      4W         ; expire
                      1D         ; default_ttl
                      )
              NS      ns1.example.com.
              NS      ns2.example.com.
      ns1     A       1.2.3.4
      ns2     A       1.2.3.5
      example.com.    A       1.2.3.4
      localhost       A       127.0.0.1
      www     CNAME   example.com.
      mail    CNAME   example.com.
      ;

    db.1.2.3

      $TTL 1D
      $ORIGIN 3.2.1.in-addr.arpa.
      @       IN SOA  ns1.example.com contact.example.com. (
                      2011042908 ;
                      8H ;
                      2H ;
                      4W ;
                      1D ;
                      )
              NS      ns1.example.com.
              NS      ns2.example.com.
      4       PTR     hostname.example.com.
      5       PTR     hostname.example.com.
      ;

    Also of note: both of these servers are managed. Tech support is very responsive, and largely useless. Hours go by with them asking me questions to narrow down what could be wrong, then they pass the ticket to the tech on the next shift, who ignores everything that's happened already and spends his whole shift asking all the same questions the last guy asked.

    So, in summary:
      - Nameservers, with IPs, are correctly registered with the domain registrar
      - named is configured and running
      - ...and must not be configured correctly, because nothing resolves

    Any help would be great. I changed domains and IPs in the files to generics, but let me know if you need to know the domain in question. Thanks!

    UPDATE: I found that I didn't have 127.0.0.1 in /etc/resolv.conf, so I added it, along with my two public IPs that I have named listening on.

    resolv.conf

      search www.example.com example.com
      nameserver 127.0.0.1
      nameserver 7.8.9.10 ;Was in here by default, authoritative nameserver of hosting company
      nameserver 1.2.3.4 ;Public IP #1
      nameserver 1.2.3.5 ;Public IP #2

    Now when I dig example.com from the host, it resolves. If I try to dig from my other server (in the same datacenter), or from the internet, it times out or I get SERVFAIL.

    Read the article

  • Should I use a hosted version of JQuery? Which one?

    - by ataylor
    Should I use a local copy of jquery, or should I link to a copy provided by Google or Microsoft? I'm primarily concerned about speed. I've heard that just pulling content from other domains can have performance advantages related to how browsers limit connections. In particular, has anyone benchmarked the speed and latency of Google vs. Microsoft vs. local? Also, do I have to agree to any conditions or licenses to link from a third-party?

    Read the article

  • How do I let a user sign in from a different domain on Authlogic?

    - by Newy
    [This is slightly different from a previous question about having multiple domains share the same cookie. It seemed like there wasn't an easy way to do that.] I have an application at application.com. A customer has app.customer.com pointed at my site on Heroku, and I have everything set up so that it renders a specific version of the app correctly. The issue is that I want a user at app.customer.com to be able to log in. I believe Authlogic is now setting the cookie on application.com, so while it verifies the credentials, no session on customer.com is ever created.

    Read the article

  • Good open source analytics/stats software in PHP?

    - by makeee
    The URL shortening service I'm building needs to display some basic click stats to users: # of clicks, conversions, referring domains, and country (filterable by a date range). I'll possibly want more advanced stats in the future. Is there existing open source software that will allow me to pass events to it and then easily display a bar or line graph of that event (for example, a line graph of "conversions" between two specified dates)? It seems like something like this should exist, and it would be much easier than building the whole thing from scratch. I know there are graphing scripts, but those still require me to format the data (usually as an XML file) and then pass it to the graph. I'm looking for something a bit more complete that I can just feed events to, and it does everything else.

    Read the article

  • Modules and custom routes

    - by Dennis Haarbrink
    I'm building a website using Zend Framework and having trouble implementing modules and custom routes. There are basically two rules:
      1. Select a module based on the domain (multiple domains can select a single module).
      2. Regardless of domain, select one specific module based on the path.
    Examples:
      domain1.com selects module domain1
      domain1.net selects module domain1
      domain2.com selects module domain2
      both domain1.com/admin and domain2.com/admin select module admin
    This is the first project where I use ZF, so my experience with the framework is basically non-existent. I have done some dirty hacking in my bootstrapper where I check the domain and then execute Zend_Layout::startMVC() to get the correct layout, but that gets messed up when I'm implementing custom routes. So I was wondering: what is the best way to go about implementing this?
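    For what it's worth, a sketch of how the two rules might look with Zend Framework 1's hostname routes is below (assumptions: ZF1, the routes are registered in the Bootstrap, and the module names follow the question; www variants would need their own hostname entries):

      <?php
      // Rule 1: several hostnames select the same module (hostname route chained
      // with a normal path route). Rule 2: /admin wins regardless of host.
      $router = Zend_Controller_Front::getInstance()->getRouter();

      $path = new Zend_Controller_Router_Route(
          ':controller/:action/*',
          array('controller' => 'index', 'action' => 'index')
      );

      foreach (array('domain1.com' => 'domain1',
                     'domain1.net' => 'domain1',
                     'domain2.com' => 'domain2') as $host => $module) {
          $hostRoute = new Zend_Controller_Router_Route_Hostname($host, array('module' => $module));
          $router->addRoute($module . '-' . $host, $hostRoute->chain($path));
      }

      // Added last, so it is matched first (ZF1 checks routes in reverse order).
      $router->addRoute('admin', new Zend_Controller_Router_Route(
          'admin/:controller/:action/*',
          array('module' => 'admin', 'controller' => 'index', 'action' => 'index')
      ));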

    Read the article

  • mysql_connect()

    - by Jacksta
    I am trying to connect to MySQL and am getting an error. I put my server's IP address in and used port 3306. Which port should be used?

      <?php
      $connection = mysql_connect("serer.ip:port", "user", "pass") or die(mysql_error());
      if ($connection) {$msg = "success";}
      ?>
      <html>
      <head>
      </head>
      <body>
      <? echo "$msg"; ?>
      </body>
      </html>

    Here is the error it's producing:

      Warning: mysql_connect() [function.mysql-connect]: Access denied for user 'admin'@'server1.myserver.com' (using password: YES) in /home/admin/domains/mydomain.com.au/public_html/db_connect.php on line 3
      Access denied for user 'admin'@'server1.myserver.com' (using password: YES)
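    For what it's worth, 3306 is MySQL's default port, so "ip:3306" (or just the IP) is the right form for the first argument; the "Access denied for user ..." message means the server was reached and the credentials or host privileges were rejected, so the fix is usually a GRANT for that user/host rather than a port change. A hedged sketch with an example IP:

      <?php
      // 203.0.113.10 is a placeholder; ":3306" is optional since it is the default port.
      $connection = mysql_connect('203.0.113.10:3306', 'user', 'pass');
      if (!$connection) {
          die('Connection failed: ' . mysql_error());   // will show the access-denied reason
      }
      echo 'success';
      ?>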

    Read the article

  • Rewrite all URL requests to https://www.example.com/$1

    - by xylar
    I have two domains, example.com and example.co.uk, that use the same application on my server. I would like to rewrite the URL depending on what the user types in. The only URLs I want are https://www.example.com and https://www.example.co.uk. In my .htaccess file I have the following:

      # Turn on URL rewriting
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^example\.co.uk$ [NC]
      RewriteRule ^(.*)$ https://www.example.co.uk/$1 [L,R=301]
      RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
      RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

    If I go to http://www.example.com it doesn't add the https; if I go to http://example.com it does. What is the best way of making the RewriteCond match the www URL?

    Read the article

  • Session being reset when using sub-domain.

    - by Adam Witko
    Hi, I'm trying to use sub-domains in my ASP.NET website, but I'm coming across a few problems with the session being reset. I've edited my hosts file to have 'localhost', 'one.localhost' and 'two.localhost'. I can go to any of these URLs, do what I need to do, and log in to my system. The session mode is defined in the web.config; I'm using SQLServer mode, as the website will be run as a web farm. What I'm finding is that when I click something that causes a postback, the session is lost and a new session id is created; when this occurs, my website is now on 'localhost' rather than the logged-in 'one.localhost', for example. Does anyone know what might be causing this? Cheers

    Read the article

  • Problem configuring application-specific logging in GlassFish v3

    - by Shane
    I am using java.util.logging in an EJB application running on GlassFish v3. I can see the log messages in server.log, but I don't seem to be able to configure the logging level in glassfish\domains\domain1\logging.properties. If I obtain the logger with:

      Logger logger = Logger.getLogger("com.foo");

    and log with:

      logger.info("message");

    then I expect that if I set com.foo.level=WARNING in logging.properties, the message should not be logged. Am I doing something wrong here?

    Read the article
