Search Results

Search found 7555 results on 303 pages for 'sites'.


  • Design a web site control panel/service

    - by HasanGursoy
    Hi, I'm a web designer and I've built many web sites, most of them coded in ASP.NET. I want to control these sites from my own database, so that even if the FTP account or the server changes I can still close or deactivate a site. Which method do you suggest: a simple AJAX check on page load, or a service call from my web site in Application_Start? Note that I mostly pre-compile the sites. Any suggestions are welcome.
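
    Whichever trigger you pick, both approaches need the same server-side piece: a central endpoint, backed by your database, that answers whether a given site should stay up. A rough sketch of that endpoint follows, in PHP purely for illustration (the sites table with domain and active columns is hypothetical; the consuming check in Application_Start would of course be C#):

        <?php
        // Central "may this site run?" endpoint (sketch; all names are placeholders).
        $conn = mysql_connect('localhost', 'user', 'password') or die('0');
        mysql_select_db('controlpanel', $conn);

        $domain = mysql_real_escape_string($_GET['domain']);
        $result = mysql_query("SELECT active FROM sites WHERE domain = '$domain'");
        $row = $result ? mysql_fetch_row($result) : false;

        // Each site calls this from Application_Start (caching the answer) or
        // from an AJAX ping, and disables itself when it reads "0".
        echo ($row && $row[0] == 1) ? '1' : '0';

    Since you pre-compile the sites, the Application_Start variant has the advantage that nothing needs to change in the deployed pages themselves.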


  • Agile Web Development

    - by sidcom
    Hi all, I'm looking for resources and information on agile web development. I have done a search and found a wiki page and lots of other sites on the subject, but most of them are oriented around Ruby on Rails. Does anyone know of any sites or resources that cover other platforms and languages, like ASP.NET and PHP, or that are even generic? Thanks


  • arboroaks.net/lakelandhills versus lakelandhillsatarboroaks.com: which is better for SEO?

    - by Roeland
    I am trying to decide on the best structure for a site I built with SEO in mind. The site has a parent site (a sort of splash page, arboroaks.net) and three child sites. The parent site is one page, and each of the three child sites is about 8-10 pages. Right now I have the three child sites set up as folders under arboroaks.net; for example, the child site lakelandhills lives at arboroaks.net/lakelandhills. I have the full domain arboroaksatlakelandhills.com redirect to this URL (arboroaks.net/lakelandhills). My question is whether the child sites should instead be contained on their own domains. Think lakelandhillsatarboroaks.com/about-us.php versus arboroaks.net/lakelandhills/about-us.php. The main reason is obviously SEO. Thanks!
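
    Whichever structure you choose, the main technical point is to keep one canonical URL per page and 301-redirect the other form to it, so search engines don't split ranking signals between duplicates. A sketch of that redirect in Apache (assuming mod_rewrite, placed in the standalone domain's vhost or .htaccess):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?lakelandhillsatarboroaks\.com$ [NC]
        RewriteRule ^(.*)$ http://arboroaks.net/lakelandhills/$1 [R=301,L]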


  • Web site backup in PHP?

    - by Pekka
    Does anybody know a clean PHP-based solution that can back up remote web sites using FTP?

    Must haves:
    - Recursive FTP backups
    - Possible to run through a cron job
    - Easy to configure (easy adding of multiple sites)
    - Local storage of backup files is sufficient

    Would be nice:
    - Backed-up sites are stored as zip files
    - A nice interface to manage things
    - Provides notification when a backup has succeeded or failed
    - Does incremental backups
    - Does MySQL database backups

    I need this to be PHP (or Perl) based because it's going to be used on shared hosting packages that do not allow use of the standard GNU/Linux tools.
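
    If nothing ready-made fits, here is a minimal sketch of the recursive-download piece, assuming PHP's standard FTP extension (host, credentials, and paths are placeholders); zipping, notifications, and MySQL dumps would be layered on top:

        <?php
        // Recursively mirror a remote FTP directory to a local one (sketch).
        function ftp_mirror($conn, $remoteDir, $localDir)
        {
            if (!is_dir($localDir)) {
                mkdir($localDir, 0755, true);
            }
            $items = ftp_nlist($conn, $remoteDir);
            if ($items === false) {
                return;
            }
            foreach ($items as $item) {
                $name = basename($item); // some servers return full paths
                if ($name === '.' || $name === '..') {
                    continue;
                }
                $path = rtrim($remoteDir, '/') . '/' . $name;
                // ftp_chdir only succeeds on directories, so use it as a test.
                if (@ftp_chdir($conn, $path)) {
                    ftp_chdir($conn, $remoteDir); // step back before recursing
                    ftp_mirror($conn, $path, $localDir . '/' . $name);
                } else {
                    ftp_get($conn, $localDir . '/' . $name, $path, FTP_BINARY);
                }
            }
        }

        $conn = ftp_connect('ftp.example.com') or die('Could not connect');
        ftp_login($conn, 'username', 'password');
        ftp_pasv($conn, true); // passive mode behaves better behind NAT/firewalls
        ftp_mirror($conn, '/public_html', '/backups/example.com');
        ftp_close($conn);

    Run from cron via the PHP CLI, this also covers the scheduling requirement on hosts that allow cron jobs but not shell tools.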


  • Subversion Problem on Mac OS X

    - by Mohsin Jimmy
    This exists in my httpd.conf file:

        <Location /svn>
          DAV svn
          SVNParentPath /Users/iirp/Sites/svn
          Allow from all
          #AuthType Basic
          #AuthName "Subversion repository"
          #AuthUserFile /Users/iirp/Sites/svn-auth-file
          #Require valid-user
        </Location>

    This configuration works fine. When I change it to:

        <Location /svn>
          DAV svn
          SVNParentPath /Users/iirp/Sites/svn
          #Allow from all
          AuthType Basic
          AuthName "Subversion repository"
          AuthUserFile /Users/iirp/Sites/svn-auth-file
          Require valid-user
        </Location>

    and access my repository through its URL, I get the authentication screen, but after that my SVN repository does not show up correctly. The message it gives me is:

        Internal Server Error
        The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator, [email protected], and inform them of the time the error occurred, and anything you might have done that may have caused the error. More information about this error may be available in the server error log.
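
    The Apache error log (on Mac OS X, usually /var/log/apache2/error_log) will name the real cause, but a common one with this exact change is that the AuthUserFile does not exist yet or is not readable by the Apache user. A sketch of the usual fix, assuming that is what the log shows:

        htpasswd -c /Users/iirp/Sites/svn-auth-file yourusername  # -c creates the file; omit it when adding more users
        sudo chown _www /Users/iirp/Sites/svn-auth-file           # _www is the Apache user on Mac OS X
        sudo apachectl restart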


  • What are some ways to identify a logged in user on the web?

    - by farinspace
    Here is the scenario: there are 5 websites (different domain names) that need to share a session. I am using a bit of code on each site that returns a "blank.gif" image and at the same time sets the session (syncing it up with the current session). Each of the sites requests a session image from each of the other sites, and all sites have access to the same database (where the session is stored). This works great in Firefox and Chrome, but not in IE (or Safari on the PC), so I need to come up with an alternative method of keeping the session active. The app is a small custom CMS, so really only 2-3 people will be using it. I could probably identify user logins by IP and then keep checking for that IP across all sites. Is there something more granular, such as a computer UUID, that I can check for?
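
    Before switching methods, it may be worth knowing why the current one fails where it does: IE, unlike Firefox and Chrome, silently drops third-party cookies (which is what the blank.gif response is setting) unless the response carries a P3P compact-policy header. A sketch of the image endpoint with that header, assuming PHP and a session handler backed by your shared database (the sid parameter is a hypothetical carrier for the master session id):

        <?php
        // Cross-domain "blank.gif" endpoint (sketch).
        header('P3P: CP="CAO PSA OUR"'); // any syntactically valid compact policy

        if (isset($_GET['sid'])) {
            // Hypothetical parameter carrying the shared session id; sign or
            // expire it if you use this, since bare ids in URLs can be stolen.
            session_id($_GET['sid']);
        }
        session_start(); // assumed to use the shared-database session handler
        $_SESSION['last_sync'] = time();

        header('Content-Type: image/gif');
        // A 1x1 transparent GIF.
        echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');

    Safari is stricter still: it refuses third-party cookies outright by default, so for it you generally need a one-time top-level redirect through each domain rather than a hidden image.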


  • How do you check the presence of many keys in a Python dictionary?

    - by Thierry Lam
    I have the following dictionary:

        sites = {
            'stackoverflow': 1,
            'superuser': 2,
            'meta': 3,
            'serverfault': 4,
            'mathoverflow': 5
        }

    To check that more than one key is present in the dictionary, I do something like:

        'stackoverflow' in sites and 'serverfault' in sites

    This is maintainable with only 2 key lookups. Is there a better way to handle checking a large number of keys in a very big dictionary?


  • domain name vs ip address, same server, but different speed

    - by bn
    I have two similar sites:

    - the two have almost exactly the same code and run on the same server
    - the sites are the same; they just use different languages
    - the database of the slower site is more populated (maybe only the user table); the other tables, for site content, are the same
    - the faster site uses root to access the database

    One of the sites is not released yet, so it is accessed by IP address instead of by domain name. The site using the IP address is a lot faster; the site using the domain name is slower. Do you know why this is happening? What could be the reason?
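
    Since the main difference is hostname versus IP, name resolution is the first thing to rule out. One quick check is curl's timing breakdown (www.slow-site.example is a placeholder for the slow domain); if time_namelookup accounts for most of the total, the problem is DNS rather than your code:

        curl -o /dev/null -s -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  total: %{time_total}s\n' http://www.slow-site.example/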


  • How can I pull multiple rows from a MySQL table and use all of them automatically for the same thing?

    - by Rob
    Basically, I have multiple URLs stored in a MySQL table. I want to pull those URLs from the table and have cURL connect to all of them. Currently the URLs are stored in the script itself, but I've added a new page that lets me add and remove them from the database, and I'd like this code to reflect that automatically. Here is what I currently have:

        $sites[0]['url'] = "http://example0.com";
        $sites[1]['url'] = "http://example1.com";
        $sites[2]['url'] = "http://example2.com";
        $sites[3]['url'] = "http://example3.com";

        foreach ($sites as $s) {
            // Now for some cURL to run it.
            $ch = curl_init($s['url']); // load the URL and send GET data
            curl_setopt($ch, CURLOPT_TIMEOUT, 2); // no need to wait for it to load; execute it and go
            curl_exec($ch); // execute
            curl_close($ch); // close it off
        }

    Now I assume it can't be too amazingly difficult to do, I just don't know how, so if you could point me in the right direction I'd be grateful. If you supply some code, please comment it appropriately so that I can understand what each line is doing.
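
    As a sketch of the database-driven version (the sites table and its url column are assumed names; it uses the same mysql_* extension style as the code above):

        <?php
        // Connect and select the database holding the URL list.
        $conn = mysql_connect('localhost', 'user', 'password') or die('DB error');
        mysql_select_db('mydb', $conn);

        // Pull every stored URL instead of hardcoding them.
        $result = mysql_query('SELECT url FROM sites', $conn);
        while ($row = mysql_fetch_assoc($result)) {
            $ch = curl_init($row['url']);                   // open a handle for this URL
            curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // fire and move on quickly
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
            curl_exec($ch);                                 // send the GET request
            curl_close($ch);                                // free the handle
        }

    URLs added or removed through your admin page are then picked up automatically on the next run.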


  • Any impact of restarting OWSTIMER every hour?

    - by Khun
    I found that OWSTIMER consumes a lot of memory while creating personal sites. (I have to pre-create personal sites for many users.) After googling, I found suggestions to restart OWSTIMER, but its memory use grows again after creating several personal sites, so I have to restart OWSTIMER every hour. Do you know of any impact of restarting OWSTIMER every hour? Thank you.


  • PHP/MySQL Problem

    - by Scott
    Why does this only print the site-specific content under the first site, and not for the other two?

        <?php
        echo 'NPSIN Data will be here soon!';

        // connect to DB
        $dbhost = 'localhost';
        $dbuser = 'root';
        $dbpass = 'root';
        $conn = mysql_connect($dbhost, $dbuser, $dbpass) or die('Error connecting to DB');
        $dbname = 'npsin';
        mysql_select_db($dbname);

        // get number of sites
        $query = 'select count(*) from sites';
        $result = mysql_query($query) or die('Query failed: ' . mysql_error());
        $resultArray = mysql_fetch_array($result);
        $numSites = $resultArray[0];
        echo "<br><br>";

        // get all sites
        $query = 'select site_name from sites';
        $result = mysql_query($query);

        // get site content
        $query2 = 'select content_name, site_id from content';
        $result2 = mysql_query($query2);

        // get site files

        // print info
        $count = 1;
        while ($row = mysql_fetch_array($result, MYSQL_NUM)) {
            echo "Site $count: ";
            echo "$row[0]";
            echo "<br>";
            $contentCount = 1;
            while ($row2 = mysql_fetch_array($result2, MYSQL_NUM)) {
                $id = $row2[1];
                if ($id == ($count - 1)) {
                    echo "Content $contentCount: ";
                    echo "$row2[0]";
                    echo "<br>";
                }
                $contentCount++;
            }
            $count++;
        }
        ?>
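
    A likely cause, reading the code: the inner while loop reads $result2 all the way to its end during the first outer iteration, and the result pointer never moves back, so sites 2 and 3 see no content rows. One minimal fix (a sketch; a single SQL JOIN would avoid the nested loop entirely) is to rewind the result set on each pass:

        $count = 1;
        while ($row = mysql_fetch_array($result, MYSQL_NUM)) {
            echo "Site $count: $row[0]<br>";
            // Rewind the content result set so every site gets a full pass over it.
            mysql_data_seek($result2, 0);
            $contentCount = 1;
            while ($row2 = mysql_fetch_array($result2, MYSQL_NUM)) {
                if ($row2[1] == ($count - 1)) {
                    echo "Content $contentCount: $row2[0]<br>";
                }
                $contentCount++;
            }
            $count++;
        }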


  • Restricting IFRAME access in PHP

    - by m0j0
    I am creating a small web page in PHP that will be accessed as an IFRAME from a couple of sites. I want to restrict access so that it works ONLY within the "approved" sites, not on other sites or when accessed directly. Does anyone have any suggestions? Is this even possible? The PHP site will run on Apache, and the sites iframing the content will probably be .NET. Just to clarify: any visitor can view the page, as long as it's iframed within an approved site; I want to block people from accessing it directly. I'm thinking cookies might be a solution, but I'm not sure.
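
    One way to combine a referer check with your cookie idea is sketched below (the approved hostnames are placeholders). Note that the Referer header can be empty or spoofed, so this is a deterrent rather than real security; a signed, expiring token issued by each approved parent site would be stronger:

        <?php
        $approved = array('approved-one.example', 'approved-two.example');

        if (empty($_COOKIE['framed_ok'])) {
            // The first framed request carries the parent page as its referer.
            $host = isset($_SERVER['HTTP_REFERER'])
                ? parse_url($_SERVER['HTTP_REFERER'], PHP_URL_HOST)
                : '';
            if (!in_array($host, $approved, true)) {
                header('HTTP/1.1 403 Forbidden');
                exit('This content must be viewed from within an approved site.');
            }
            // Remember that the check passed, since navigation inside the
            // iframe will carry this site's own pages as the referer.
            setcookie('framed_ok', '1');
        }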


  • Multisite-Enabling a Table

    - by Joe Fitzgibbons
    I am creating a table (table A) that will have a number of columns (of course), and there will be another table (table B) that holds metadata associated with rows in table A. I am working with a multi-site implementation that has one database for the whole shebang. Rows in table A can belong to any number of sites but must belong to at least one. The problem I have is that I am not sure of the best practice for defining which sites each row in table A belongs to. I want performance and scalability: there is no finite number of sites going forward, and rows in table A could belong to any number of sites in the future (right now there are only 3). My initial thought is to have a primary site ID in table A, with rows in table B defining additional sites as needed. Another thought is to have a boolean column in table A for each site, indicating whether the row belongs to that site. Lastly, I have thought about having another table that maps rows in table A to each site. What is the best way to associate rows in a table with any number of sites, with performance and scalability in mind?
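
    Of the three options, a boolean column per site means an ALTER TABLE for every new site, which fights the "no finite number of sites" requirement; the separate mapping table is the conventional many-to-many design and stays fast as long as it is indexed in both directions. A sketch, with placeholder table and column names:

        -- One row per (row, site) membership.
        CREATE TABLE table_a_sites (
            row_id  INT NOT NULL,  -- references table A's primary key
            site_id INT NOT NULL,  -- references the sites table
            PRIMARY KEY (row_id, site_id),
            KEY idx_site (site_id)
        );

        -- All rows of table A belonging to site 42:
        SELECT a.*
        FROM table_a AS a
        JOIN table_a_sites AS m ON m.row_id = a.id
        WHERE m.site_id = 42;

    The "must belong to at least one site" rule would then be enforced in application code or with a trigger, since the schema alone won't express it.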


  • phpmyadmin login redirect fails with custom ssl port

    - by baraboom
    The server is running Ubuntu 10.10, Apache 2.2.16, PHP 5.3.3-1ubuntu9.3, and phpMyAdmin 3.3.7deb5build0.10.10.1. Since this same server is also running Zimbra on port 443, I've configured Apache to serve SSL on port 81. So far, one CMS script runs on this virtual host successfully.

    However, when I access /phpmyadmin (set up with the default alias) on my custom SSL port and submit the login form, I am redirected to http://vhost.domain.com:81/index.php?TOKEN=foo (note the http:// instead of the https:// that the login URL was using). This generates an Error 400 Bad Request complaining about "speaking plain HTTP to an SSL-enabled server port". I can then manually change the http:// to https:// in the URL and use phpMyAdmin as expected. I was annoyed enough to spend an hour trying to fix it, and now even more annoyed that I cannot figure it out. I've tried various things, including:

    - Adding $cfg['PmaAbsoluteUri'] = 'https://vhost.domain.com:81/phpmyadmin/'; to /usr/share/phpmyadmin/config.inc.php, but this did not correct the problem (even though /usr/share/phpmyadmin/libraries/auth/cookie.auth.lib.php looks like it should honor it and use it as the redirect).
    - Adding $cfg['ForceSSL'] = 1; to the same config.inc.php, but then Apache spirals into an infinite redirect.
    - Adding a rewrite rule to the vhost-ssl conf file in Apache, but I was unable to figure out the condition to use when http:// is present along with the correct SSL port of :81.
    - Lots of googling.

    Here are the relevant Apache configuration pieces:

    /etc/apache2/ports.conf:

        <IfModule mod_ssl.c>
            NameVirtualHost *:81
            Listen 81
        </IfModule>

    /etc/apache2/sites-enabled/vhost-nonssl:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            ServerName vhost.domain.com
            DocumentRoot /home/xxx/sites/vhost/html
            RewriteEngine On
            RewriteCond %{HTTPS} off
            RewriteRule (.*) https://%{HTTP_HOST}:81%{REQUEST_URI}
        </VirtualHost>

    /etc/apache2/sites-enabled/vhost-ssl:

        <VirtualHost *:81>
            ServerAdmin webmaster@localhost
            ServerName vhost.domain.com
            DocumentRoot /home/xxx/sites/vhost/html
            <Directory />
                Options FollowSymLinks
                AllowOverride None
                AuthType Basic
                AuthName "Restricted Vhost"
                AuthUserFile /home/xxx/sites/vhost/.users
                Require valid-user
            </Directory>
            <Directory /home/xxx/sites/vhost/html/>
                Options -Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
            </Directory>
        </VirtualHost>

    /etc/apache2/conf.d/phpmyadmin.conf:

        Alias /phpmyadmin /usr/share/phpmyadmin

    (The rest of the default .conf truncated.)

    Everything in the Apache config seems to work fine (the rewrite from non-SSL to SSL, the HTTP authentication); the problem only happens when I submit the phpMyAdmin login form from https://vhost.domain.com:81/index.php. Other configs: the phpMyAdmin config is completely default, and php.ini has only had minor changes to memory and timeout limits. These seem to work fine; as mentioned, another PHP script runs with no problem, and phpMyAdmin works great once I manually enter the correct scheme after login.

    I'm looking for either a band-aid I can apply to save me the trouble of manually entering the https:// after login, a real fix that will make phpMyAdmin behave as I think it should, or some greater understanding of why my desired config is not possible.
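
    One guess worth ruling out: Ubuntu's packaged phpMyAdmin reads its settings from /etc/phpmyadmin/config.inc.php, and values there can override edits made to /usr/share/phpmyadmin/config.inc.php, which would explain why PmaAbsoluteUri appeared to be ignored. A sketch of the change in the packaged location:

        // in /etc/phpmyadmin/config.inc.php (the file the Ubuntu package actually uses)
        $cfg['PmaAbsoluteUri'] = 'https://vhost.domain.com:81/phpmyadmin/';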


  • Security Trimmed Cross Site Collection Navigation

    - by Sahil Malik
    This article documents a fully functional CodePlex project that I just created. The project gives you a WebPart that provides security-trimmed navigation across site collections.

    The first question is: why create such a project? In every single SharePoint project you do, one question you will always face is, what should the boundaries of sites be, and what should the boundaries of site collections be? There is no good or bad answer, because it really depends on your needs. There are some factors in play here:

    - Site collections allow you to scale, as a site collection is the smallest entity you can put inside a content database.
    - Site collections allow you to offer different levels of SLA, because you can put a site collection in a separate content database and put that database on a separate server.
    - Site collections are a security boundary, and they can be moved around at will without affecting other site collections.
    - Site collections are also a branding boundary.
    - They are also a feature deployment boundary, so you can have two site collections on the same web application with completely different natures of services.

    But site collections break navigation: a site collection at "/" and a site collection at "/sites/mySiteCollection" are completely independent of each other. If you have access to both, the navigation of / won't show you a link to /sites/mySiteCollection. Some people refer to this as a huge issue in SharePoint. Luckily, some workarounds exist. A long time ago I blogged about "Implementing Consistent Navigation across Site Collections". That approach was a no-code solution and it worked, giving you consistent navigation across site collections, but it didn't work in a security-trimmed fashion! That is, if I don't have access to site collection 'X', it would still show me a link to 'X'. This project gets around that issue. Simply deploy it, and it will give you a WebPart. You can use it either as a WebPart or as a server control dropped via SharePoint Designer, and it will give you security-trimmed cross-site-collection navigation. The code was written for SP2010, but it will work in SP2007 with the help of http://spwcfsupport.codeplex.com .

    What do I need to do to make it work? I'm glad you asked! Simple! Deploy the .wsp (which you can download here). This gives you a site collection feature called "Winsmarts Cross Site Collection Navigation", as shown below. Go ahead and activate it, and this will give you a WebPart called "Winsmarts Navigation Web Part", as shown below. Just drop this WebPart on your page, and it will show all site collections that the currently logged-in user has access to. Really, it's that easy! In the example below, I have two site collections, created at /sites/SiteCollection1 and /sites/SiteCollection2, and the navigation shows their titles. You may also see some extraneous entries; you might want to clean those up, and I'll talk about that in a minute.

    What? You're running into problems? If the problem is that you are prompted to log in three times and are then shown a blank WebPart that says "Loading your applications .." and then craps out, most probably you're using a different authentication scheme. Behind the scenes I use a custom WCF service to perform this job. Out of the box I've set it to work with NTLM, but if you need it to work with alternate authentication schemes such as forms-based auth or client-side certs, you will need to edit the %14%\ISAPI\Winsmarts.CrossSCNav\web.config file, specifically this section:

        <bindings>
          <webHttpBinding>
            <binding name="customWebHttpBinding">
              <security mode="TransportCredentialOnly">
                <transport clientCredentialType="Ntlm"/>
              </security>
            </binding>
          </webHttpBinding>
        </bindings>

    For Kerberos, change the clientCredentialType to "Windows". For forms auth, remove the transport line. For client certs, well, that's a bit more involved, but it's just web.config changes; hit a good book on WCF, or hire me for a billion trillion $. Fair warning, though: I might be too busy to help immediately. If you're running into a different problem, please leave a comment below, but the code is pretty rock solid, so .. hmm .. check what you're doing! BTW, I don't make any guarantee/warranty on this; if this code makes you sterile, unpopular, or gives you a bad hairstyle, that is your problem! There are, however, some known issues:

    - I wrote this as a concept; you can easily extend it to be more flexible. For example, hierarchical nav, horizontal nav, or jazzy effects with jQuery or Silverlight are all possible very easily.
    - This WebPart is not smart enough to coexist with another instance of itself on the same page. I can easily extend it to do so, which I will do in my spare(!?) time!

    Okay, good! But that's not all! As you can see, just dropping the WebPart may show you many extraneous site collections; perhaps you want to restrict which site collections are shown, or exclude a certain site collection from the navigation. To support that, I created a property on the WebPart called "UrlMatchPattern", which is a regex you specify to trim the results :). Just edit the WebPart and specify a string property such as "http://sp2010/sites/", as shown below. Note that you can put in whatever regex you want, so go crazy, I don't care! And this gives you a cleaner look.

    w00t! Enjoy!


  • Introduction to Developing Mobile Web Applications in ASP.NET MVC 4

    - by bipinjoshi
    As mobile devices become more and more popular, web developers are finding it necessary to target them when building web sites. While developing a mobile web site is challenging, due to the complexity of device detection, screen sizes, and browser support, ASP.NET MVC 4 makes a developer's life easier by providing straightforward ways to build mobile web applications. To that end, this article introduces the basics of developing web sites targeted at mobile devices using ASP.NET MVC 4:

    http://www.binaryintellect.net/articles/7a33d6fa-1dec-49fe-9487-30675d0a09f0.aspx


  • OS X Snow Leopard 10.6 Refuses to Load Websites the first time intermittently

    - by Brandon
    Many times when I am browsing the web, Snow Leopard will sit and load a site for 20 seconds or more, until it times out and says it cannot be displayed. If I refresh, it loads RIGHT away, every time. The issue is intermittent but happens anywhere from once every couple of days to a few times a day. So the long and short of it is this:

    - Aluminum MacBook (non-Pro), 2.4GHz Core 2 Duo, 4GB DDR3
    - I am on 10.6.6, but I have had this issue since 10.6.0
    - It happens in Firefox, Chrome, and Safari
    - I have flushed my DNS (using the command 'blablabla flush')
    - I am using custom DNS servers, which I hoped would fix it but had no effect*
    - I am running Apache currently, but haven't been for most of the time
    - I've reformatted multiple times, always experiencing the issue
    - I am on Cox cable internet, with a Motorola Surfboard and a Belkin F6D4230-4 v1 (Pre?) N wireless router; I've put the router in G only, N only, and G+N, to no effect
    - It seems to be domain-dependent, as I can sometimes load the Google cache right away, and sometimes other sites will load but Google will refuse
    - My PowerBook G4 with Leopard, other Windows XP laptops, and my wired Win7 desktop do not suffer from the issue

    *I recently started using these to escape the awful Cox redirect page on timeouts.

    I'm almost positive the issue has happened on other networks, but I can't recall a specific instance (I have a terrible memory). The problem is intermittent and fixable enough (I just have to wait until it times out and hit refresh one time) but incredibly annoying, since I'm constantly reading documentation from a large variety of sites.

    EDIT: To clarify, this happens with ALL sites, not only specific ones. I haven't been able to detect any pattern to the failures: one day Google.com will refuse to load while reddit.com will, and the next day vice versa. Keep in mind that waiting for a timeout and hitting refresh loads the page right away, every time. If I don't wait for the timeout, opening more links, hitting refresh, and clicking the link a billion times have no effect. It seems to affect sites at random. It doesn't seem to have anything to do with connection inactivity either, because I will be SSHed into different servers, uploading files, browsing, downloading, etc., and it will just quit loading jquery.com (for example) until I sit and wait for a timeout. /EDIT

    This is my last resort. Please, someone, tell me what is happening. Thank you.


  • How does a website latency simulator work?

    - by nighthawk457
    Sites like WebPagetest allow users to enter a website URL and a test location, and then run a speed test on the site from multiple locations using real browsers. Can anyone give me a basic idea of how sites like this work? There are also plugins like Aptimize's latency simulator, or the Charles web debugging proxy app, that simulate the delay of accessing a site from different locations. I am assuming that, since these are plugins, they function in a different way. How do these plugins work?
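
    I can't speak for those specific products, but the two architectures described are both real: services like WebPagetest drive actual browsers on machines physically located in each test region and record the timings, while simulators inject artificial delay into the connection instead. On Linux, for example, the netem queueing discipline does exactly that (a sketch, assuming eth0 is the active interface):

        # Add 200ms of delay (with 20ms of jitter) to everything leaving eth0.
        sudo tc qdisc add dev eth0 root netem delay 200ms 20ms
        # Remove it again when done.
        sudo tc qdisc del dev eth0 root netem

    Proxy-based tools like Charles do the equivalent in user space: every request passes through the proxy, which holds responses for a configured delay and throttles bandwidth.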

