Search Results

Search found 7625 results on 305 pages for 'scraper sites'.


  • IIS7 Rewrite rule being duplicated across 2 different websites (unwanted)

    - by Matt
    We have IIS7 on Windows Server 2008. It is hosting a handful of sites, on a handful of IP addresses. 2 of those sites are actually wildcards on the domain: *.firstdomain.com and *.seconddomain.com. However, I am finding that any URL Rewrite rule I add for one of these "websites" automatically appears in the URL Rewrite section for the other. Similarly, if I disable the rule in one, it disables in the other. This doesn't happen with the other sites defined on this server, just these two. I looked at the parent (top level, the server as a whole), and the rule is not there. Any idea what's going on here?
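
    A hedged pointer on where to look: IIS7 URL Rewrite persists its rules in the web.config at a site's physical root, so two sites that share the same content folder will always show (and toggle) the same rules. A minimal sketch of the section to check for, assuming a shared folder; the rule itself is illustrative:

        <!-- web.config at the shared physical root; both sites read and write
             this same file, which is why rules appear duplicated in the UI -->
        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="ExampleRule" enabled="true">
                  <match url="^old/(.*)$" />
                  <action type="Redirect" url="new/{R:1}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    If the two wildcard sites do point at the same folder, giving each its own physical path separates their rule sets.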

    Read the article

  • Restart single uWSGI application (when it's in emperor mode)

    - by Oli
    I'm running uWSGI in emperor mode to host a bunch of Django sites based on their individual configs. These are supposed to reload when the emperor detects a change in the config file, and this largely works when I just touch the relevant uwsgi.ini file. But occasionally I'll mess something up in the Django site and the server won't load. Yeah, yeah, I should be testing better, but that's not really the point. When this happens, uWSGI seems to mark the site as dead and stops trying to run it (seems to make sense). Even after I fix the underlying issue, no amount of touching will get that site's uWSGI process up and running. I have to reload the whole uWSGI server (knocking dozens of sites out at once for a few seconds). Is there a way to force uWSGI to just reload one of its sites?
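
    A hedged sketch of one approach: uWSGI's touch-reload directive ties a vassal's restart to a marker file separate from its config, so a single site can be bounced without touching the emperor. Paths and module names below are hypothetical, and whether an emperor resurrects a vassal it has already marked dead varies by version, so treat this as a starting point:

        ; /etc/uwsgi/vassals/mysite.ini  (assumed vassal layout)
        [uwsgi]
        module = mysite.wsgi
        master = true
        ; touching this file restarts only this vassal
        touch-reload = /var/run/uwsgi/mysite.reload

    Then, after fixing the Django app:

        touch /var/run/uwsgi/mysite.reload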

    Read the article

  • Blocking Facebook's Like button in Firefox

    - by Quiark
    Many sites today use widgets from Facebook such as the Like button, the list of friends who are fans of that site, and so on. While it may be a nice feature, I perceive it to be a serious privacy intrusion, because Facebook most likely stores information about which sites you visit. I also heard that even when you are not logged into Facebook, it still tracks the sites you visit (probably with a cookie) and, once you log in, attaches the data to your real account. For now, I want to keep using Facebook, but I would like to block just these widgets so it can't track me. Is there any Firefox extension which could do that?

    Read the article

  • Squid 2.7 in offline_mode yet tries to contact DNS servers to resolve addresses

    - by William C
    I installed Squid 2.7 to act as a web cache on my laptop, so that I can browse previously-visited sites when I don't have WiFi. Apart from http_access allow all, I've made no changes to the default squid.conf configuration. When I turn offline_mode ON, disconnect from the Internet and then visit sites, I encounter the following error on every site I visit:

        The following error was encountered:
        Unable to determine IP address from host name for whatever.sitename.com
        The dnsserver returned: Timeout

    What settings do I need to add to squid.conf so I can browse sites offline?
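
    A hedged observation: offline_mode stops revalidation, but Squid still tries to resolve each requested hostname before serving anything, which is where the timeout comes from. One workaround sketch, assuming the hostnames of the cached sites can be listed locally:

        # squid.conf fragment; hosts_file lets Squid resolve names from a
        # local file instead of a nameserver while disconnected
        offline_mode on
        hosts_file /etc/hosts

    with /etc/hosts carrying placeholder entries such as "127.0.0.1 whatever.sitename.com" for each cached site. Whether a given Squid 2.7 build consults hosts_file before its dnsserver helpers is worth verifying; treat this as a starting point rather than a confirmed fix.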

    Read the article

  • How can I figure out which PHP extensions aren't being used?

    - by Tom Marthenal
    I manage a server (running Ubuntu) which hosts our clients' sites: a few dozen different PHP-based websites, mostly small, but also some installations of CMSes and forums. I used the get_loaded_extensions() function to see what extensions I have loaded. To help streamline the server (remove unnecessary extensions to make upgrading easier and marginally improve speed), I'd like to remove extensions that aren't being used by any of the sites. I currently have 54 different extensions loaded. I can easily eliminate some of these from the list which I know are used, but others I am less sure about. Is there some way that I can see which extensions have not been used recently?
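
    A hedged static-analysis sketch: it can't prove an extension is unused (dynamic calls, class-based APIs and constants all evade a plain grep), but it narrows the candidate list. The /var/www root is an assumption:

        <?php
        // For each loaded extension, grep the site tree for any of its
        // functions; extensions with zero hits are removal *candidates* only.
        $root = '/var/www';
        foreach (get_loaded_extensions() as $ext) {
            $funcs = get_extension_funcs($ext);
            if (!$funcs) {
                echo "$ext: exposes no functions, check manually\n";
                continue;
            }
            $pattern = implode('|', array_map('preg_quote', $funcs));
            $cmd = 'grep -rliE ' . escapeshellarg($pattern) . ' ' . escapeshellarg($root);
            $hits = shell_exec($cmd);
            echo $ext . ($hits ? ": referenced\n" : ": no references found\n");
        }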

    Read the article

  • How does the performance of pure Nginx compare to cpNginx?

    - by jb510
    There is now a cPanel plugin to fairly easily set up Nginx as a reverse proxy on a cPanel/Apache server. I've been simultaneously interested in setting up my first unmanaged VPS and my first Nginx server, and as a masochist figured why not combine the two. I'm wondering, however, if it's worth setting up a pure Nginx server vs trying out cpNginx on Apache. My goal is solely to host WordPress sites, and while what I've read raves about Nginx's exceptional ability to serve static content, at least as a reverse proxy, I am unclear whether there is a substantial benefit to running pure Nginx with eAccelerator over cpNginx on Apache for dynamic sites. Regardless, I'll be running W3TC on all sites to cache content, but am still interested in whether there are big CPU reductions running PHP scripts under pure Nginx over cpNginx.

    Read the article

  • How can I simulate production servers at home with Linux VMs [closed]

    - by user31
    I am thinking of making a small simulation at home of how the big companies run their systems, to get a feel for it. I have a server with 8GB RAM and a quad core processor. I am thinking of the following setup, if that's possible; because I have not worked with bigger companies, I want to know how I can do that. I am thinking of creating 5 virtual machines: VM1 will be the database server and will have all the databases, like MySQL, PostgreSQL, SQLite, MongoDB and Oracle. VM2 will be the web server and will have Apache and Tomcat installed. VM3 will be the file server, where I will have all the web sites' files. VM4 I am thinking of as the main box where I can install Python, PHP and Java/J2EE sites, but I'm not sure. VM5 will have Windows Server 2008 for C# .NET applications. My main idea is to be able to host sites in PHP, Python and Java/J2EE with Spring. Is my setup OK, or am I missing a few things? Please guide me with the correct setup so that I can learn this stuff.

    Read the article

  • Limiting to the ServerName in Apache2

    - by David
    I have 2 sites defined in my Apache2. Each one has a ServerName. For example:

        Server 1 (first in sites-enabled) responds to www.example.com
        Server 2 (second in sites-enabled) responds to www.example2.com

    OK, the problem is that when I type the server IP in the URL, the first server responds. How can I limit each server to responding only to its ServerName? I would like to block the IP calls. If that is not possible, I would like the second server to respond, not the first. I cannot change the order, because there are aliases defined in the second server that would override the first server's config.
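
    A hedged sketch of the usual pattern: Apache hands any request that matches no ServerName/ServerAlias to the first vhost loaded, so adding a dedicated catch-all that sorts before the real sites (e.g. sites-enabled/000-default) absorbs bare-IP requests without reordering the other two. Names and the 2.4/2.2 split below are assumptions:

        # catch-all vhost, loaded first
        <VirtualHost *:80>
            ServerName catchall.invalid
            DocumentRoot /var/www/empty
            <Location />
                Require all denied   # Apache 2.4; on 2.2 use: Order allow,deny / Deny from all
            </Location>
        </VirtualHost>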

    Read the article

  • Issues With IIS Hosting Two Domains From Same Folder

    - by Bob Mc
    I have two different domain names that resolve to the same ASP.Net site. Both domains are hosted on the same server, which runs Windows Server 2003 and IIS6. The sites are differentiated in IIS Manager using host headers. However, both of the sites point to the same folder on the local drive for the site's page files. I am occasionally experiencing an ASP.Net error that says "The state information is invalid for this page and might be corrupted." I'm the site developer so I've addressed all the relevant code-related causes for this issue. However, I was wondering whether having two domains/sites sharing the same folder for an ASP.Net application might be causing this intermittent error. Also, is this generally a bad practice? Should I make separate, duplicate folders for each of the domains? Seems like that can become a maintenance headache.
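
    On the intermittent "state information is invalid" error, one hedged avenue: IIS treats each host-headered site as its own ASP.NET application with its own auto-generated machine key, so ViewState produced under one host name can fail validation under the other even though the files are identical. Pinning an explicit machineKey in the shared web.config makes both sites validate identically; the key values below are placeholders to be generated, not real keys:

        <!-- shared web.config: both host-headered sites pick this up -->
        <system.web>
          <machineKey
              validationKey="GENERATE-A-HEX-VALIDATION-KEY-HERE"
              decryptionKey="GENERATE-A-HEX-DECRYPTION-KEY-HERE"
              validation="SHA1" decryption="AES" />
        </system.web>

    Sharing one folder between two sites is workable and avoids duplicate maintenance, but separate IIS applications over one physical path is a common source of exactly this class of subtle state issue.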

    Read the article

  • Dynamic Virtual Hosts In Apache with www and non-www subdomains

    - by haukish
    I don't know Apache very well and I've got a problem configuring mod_vhost_alias. This is my httpd.conf file:

        UseCanonicalName Off
        LogFormat "%V %h %l %u %t \"%r\" %s %b" vcommon
        <Directory /var/www/sites/>
            Options FollowSymLinks
            AllowOverride All
        </Directory>
        <VirtualHost *:80>
            CustomLog logs/access_log.sites vcommon
            ServerAlias *.domain.com
            UseCanonicalName Off
            VirtualDocumentRoot /var/www/sites/%1/
        </VirtualHost>

    Subdomains work fine without www., but I need to make them work with www too. Here's an example:

        something.domain.com     - site is loading
        www.something.domain.com - Not Found

    What should I do?
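
    A hedged sketch of one fix: with www.something.domain.com, %1 resolves to "www", so the wildcard vhost looks for /var/www/sites/www/. Adding a more specific vhost for the www-prefixed form ahead of the generic one, keyed on %2 (the second hostname component), is one way through:

        # must be loaded before the *.domain.com vhost so it matches first
        <VirtualHost *:80>
            ServerAlias www.*.domain.com
            UseCanonicalName Off
            VirtualDocumentRoot /var/www/sites/%2/
        </VirtualHost>

    Whether your Apache version orders these two aliases as intended is worth testing; a mod_rewrite rule canonicalising www.X.domain.com to X.domain.com is the common alternative.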

    Read the article

  • Some Domain Clients unable to access certain websites

    - by Shaunie
    I have a small domain, around 20 clients, with a 2003 R2 SP2 DC. Most of my clients can browse the internet freely and don't have a problem. However, a couple are reporting problems accessing certain sites, e.g. Hotmail, Skyscanner, BBC News. They can browse the sites sometimes, then other times they get 408/409 errors. Other machines in the domain can access these sites. I have cleared out the DNS cache on these machines and modified the external DNS servers on the DC, still to no avail. The main issue is that the person not able to access Skyscanner uses it several times a day to book flights for employees going on leave or returning to work. Both clients are running XP SP3, though one machine is being changed for one running Win7 shortly. Any advice greatly appreciated. Thanks

    Read the article

  • How can I keep persistent cookies from certain domains only?

    - by Mike L.
    This question is similar to this one, which covers Firefox, but I want to know how to do it in Chrome: I want Chrome to clear cookies from all sites except those from certain domains. In the Cookies section of Content Settings I've made the following selections:

        (*) Allow local data to be set (recommended)
        ( ) Allow local data to be set for the current session only
        ( ) Block sites from setting any data
        [ ] Block third-party cookies and site data
        [x] Clear cookies and other site and plug-in data when I close my browser

    After logging in to my preferred website(s), I find the required domains listed when I click All cookies and site data. Let's say I find some cookies for mysite.com and www.mysite.com. Now I click Manage exceptions and enter these items:

        Hostname Pattern    Behavior
        ----------------------------
        mysite.com          Allow
        www.mysite.com      Allow

    Unfortunately, this does not seem to work, because when I close Chrome and reopen it, all cookies are gone, even those from the configured mysite.com hosts.

    Read the article

  • Does downloading the same torrent from more than one server increase the sources / seeds?

    - by Petruza
    Does downloading the same torrent from more than one server increase the sources/seeds? Edit: What I meant by servers is sites like Mininova, isoHunt and the like. I know those aren't the trackers; what I want to know is whether, when more than one of those indexing sites lists the same torrent, they are likely to contain different lists of trackers, so that getting the torrent from both sites will increase the trackers, and hence the seeds, or whether it's just in case one of the sites goes down. I guess I can test this by getting one torrent from one site with a torrent client, and the same torrent from another site with another client, and comparing the list of trackers and the number of seeders and leechers.
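
    A hedged shortcut for that comparison, assuming the transmission-cli tools are available (transmission-show dumps a .torrent's metadata, including its tracker list, without loading a client):

        transmission-show copy-from-site-a.torrent
        transmission-show copy-from-site-b.torrent

    If both copies carry the same info hash, they describe the same swarm, so the seeds are shared either way; differing tracker lists can still expose extra peers once a client merges them.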

    Read the article

  • Chrome: middle mouse button overridden on certain web pages?

    - by buli
    Normally when I click on a link in Chrome with the middle mouse button, the page opens in a new tab. I find it useful and would like it to always work that way. But on some sites, specifically the SharePoint sites I'm often working with, middle-clicking certain links opens the page in the same tab. An example can be found on this publicly available SharePoint demo page when clicking on the "Internet Sites" or "SharePoint resources" links with the middle mouse button. Those menu items seem like links (they definitely have URLs assigned). Why do they open in the same tab instead of a new one when middle-clicked? Can this behavior be changed in Chrome's settings or using some addon?
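
    A hedged illustration of why this can happen (generic markup, not SharePoint's actual code): when a link navigates via script instead of via its href, the browser's middle-click handling only applies to the href, so the scripted navigation stays in the current tab:

        <!-- mouseup fires for any mouse button, so even a middle-click runs
             the handler and navigates in place; the href is just a decoy -->
        <a href="#" onmouseup="window.location='http://example.com/target'">Internet Sites</a>

    If that is what the site does, only an extension that rewrites such links to plain hrefs, rather than any Chrome setting, can restore the new-tab behaviour.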

    Read the article

  • Sharepoint (active directory account creation mode) - Using STSADM

    - by vivek m
    This question is regarding using the STSADM command to create a new site collection in Active Directory account creation mode. My setup is like this: I have 2 virtual PCs on a Windows XP Pro SP3 host. Both VPCs are Windows Server 2003 R2. One VPC acts as the DC, DNS server and DHCP server, has Active Directory installed, and is also the database server. The other VPC is the domain member, and it is the IIS web server and POP/SMTP server and has WSS 3.0 installed. I created a new site using the GUI in the Central Admin page. For creating a site collection under the newly created site, I needed to use the STSADM command line tool, since it cannot be done from the Central Admin page in Active Directory account creation mode. That's where I got into a problem:

        stsadm.exe -o createsite -url http://vivek-c5ba48dca:1111/sites/Sales -owneremail [email protected] -sitetemplate STS#1
        The format of the specified domain name is invalid. (Exception from HRESULT: 0x800704BC)

    The following is the output from the SharePoint log:

        stsadm: Running createsite
        9e7d Medium   Initializing the configuration database connection.
        95kp High     Creating site http://vivek-c5ba48dca:1111/sites/Sales in content database WSS_Content_Sharepoint_1111
        95kq High     Creating top level site at http://vivek-c5ba48dca:1111/sites/Sales
        72jz Medium   Creating site: URL "/sites/Sales"
        72e1 High     Unable to get domain DNS or forest DNS for domain sharepointsvc.com. ErrorCode=1212
        8jvc Warning  #1e0046: Adding user "spsalespadmin" to OU "sharepoint_ou" in domain "sharepointsvc.com" FAILED with HRESULT -2147023684.
        72k1 High     Cannot create site: "http://vivek-c5ba48dca:1111/sites/Sales" for owner "@\@", Error: , 0x800704bc
        8e2s Medium   Unknown SPRequest error occurred. More information: 0x800704bc
        95ks Critical The site /sites/Sales could not be created. The following exception occured: The format of the specified domain name is invalid. (Exception from HRESULT: 0x800704BC).
        72ju High     stsadm: The format of the specified domain name is invalid. (Exception from HRESULT: 0x800704BC)
                      Callstack:
                      at Microsoft.SharePoint.Library.SPRequest.CreateSite(Guid gApplicationId, String bstrUrl, Int32 lZone, Guid gSiteId, Guid gDatabaseId, String bstrDatabaseServer, String bstrDatabaseName, String bstrDatabaseUsername, String bstrDatabasePassword, String bstrTitle, String bstrDescription, UInt32 nLCID, String bstrWebTemplate, String bstrOwnerLogin, String bstrOwnerUserKey, String bstrOwnerName, String bstrOwnerEmail, String bstrSecondaryContactLogin, String bstrSecondaryContactUserKey, String bstrSecondaryContactName, String bstrSecondaryContactEmail, Boolean bADAccountMode, Boolean bHostHeaderIsSiteName)
                      at Microsoft.SharePoint.Administration.SPSiteCollection.Add(SPContentDatabase database, String siteUrl, String title, String description, UInt32 nLCID, String webTemplate, String ownerLogin, String ownerName, String ownerEmail, String secondaryContactLogin, String secondaryContactName, String secondaryContactEmail, String quotaTemplate, String sscRootWebUrl, Boolean useHostHeaderAsSiteName)
                      at Microsoft.SharePoint.Administration.SPSiteCollection.Add(String siteUrl, String title, String description, UInt32 nLCID, String webTemplate, String ownerLogin, String ownerName, String ownerEmail, String secondaryContactLogin, String secondaryContactName, String secondaryContactEmail, Boolean useHostHeaderAsSiteName)
                      at Microsoft.SharePoint.StsAdmin.SPCreateSite.Run(StringDictionary keyValues)
                      at Microsoft.SharePoint.StsAdmin.SPStsAdmin.RunOperation(SPGlobalAdmin globalAdmin, String strOperation, StringDictionary keyValues, SPParamCollection pars)
        8wsw High     Now terminating ULS (STSADM.EXE, onetnative.dll)

    It seems to me that the trouble started with this: "Unable to get domain DNS or forest DNS for domain sharepointsvc.com. ErrorCode=1212". The network connection to the sharepointsvc.com domain seems to be fine:

        C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN>stsadm -o getproperty -pn ADAccountDomain
        <Property Exist="Yes" Value="sharepointsvc.com" />

        C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN>stsadm -o getproperty -pn ADAccountOU
        <Property Exist="Yes" Value="sharepoint_ou" />

        C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN>nslookup sharepointsvc.com
        Server:  vm-winsrvr2003.sharepointsvc.com
        Address: 192.168.0.5
        Name:    sharepointsvc.com
        Addresses: 192.168.0.21, 192.168.0.5

    Is there any way of checking the domain connection from within SharePoint (like using some getproperty of the STSADM tool)? Does anyone have any clue about this? (Any pointers would be very helpful.) Thanks.

    Read the article

  • Help with SVN+SSH permissions with CentOS/WHM setup

    - by Furiam
    Hi Folks, I'll try my best to explain how I'm trying to set up this system. Imagine a production server running WHM with various sites. We'll call these sites site1, site2, site3. Now, with the WHM setup, each site has a user/group defined for it; we'll keep these users/groups called site1, site2 for simplicity reasons. Updating these sites is accomplished using SVN, through the use of a post-commit script that auto-updates them (with .svn blocked through the Apache configuration). There are two regular maintainers of these sites; we'll call them Joe and Bob. Joe and Bob both have command-line access to the server through their respective limited accounts.

    So I've done the easy bit: managed to get SVN working with these maintainers, so that when an SVN commit occurs, the changes are checked out and go live perfectly. Here's the caveat, and ultimately my problem: user permissions. Through my testing of this setup, I've only managed to get it working by giving whatever is being updated permissions of 777, so that Joe and Bob both have read and write access to the webfront directories for each of the sites.

    So, an example of how it's set up now: Joe and Bob both belong to a group called "dev". I have the master /svn folders set up with both read and write access for this group, and it works great. A post-commit trigger updates the site, and then sets 777 on each file within the webfront. I then changed this to try to factor in group permission updates instead of straight 777: each file in /home/site1/public_html initially gets given a chmod of 664, and each folder 775, which looks a little something like this:

        drwxrwxr-x  .
        drwxrwxr-x  ..
        drwxrwxr-x  site1 site1  my_test_folder
        -rw-rw-r--  site1 site1  my_test_file

    So site1 is the owner and group owner of those files and folders. I then added site1 to Joe's and Bob's secondary groups so that the SVN update will correctly allow access to these files. Herein lies the problem now. When I add a file or folder to /home/site1, say bobs_file, it then looks like this:

        drwxrwxr-x  .
        drwxrwxr-x  ..
        drwxr-xr-x  Bob   dev    bobs_folder
        drwxrwxr-x  site1 site1  my_test_folder
        -rw-rw-r--  Bob   dev    bobs_file
        -rw-rw-r--  site1 site1  my_test_file

    How can I get it, with the set of user permissions Bob has available, to change the owner and group owner of that file to reflect site1/site1? As Bob belongs to dev I can set the permissions correctly with chmod, but it appears chgrp is throwing back operation errors. Now, this was long-winded enough to give an overview of exactly what I'm trying to accomplish, just in case I'm going about this arse-over-tit and there's a far easier solution. Here are my goals:

    - 2 people to update multiple user accounts, given the structure of WHM
    - maintain master user/group permissions of files and folders for the original user account, not the account of the updater
    - I like the security of SVN+SSH over just SVN
    - don't want to run all this over root

    I hope this made sense, and thanks in advance :)
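
    A hedged sketch of the usual Unix answer: non-root users can only chgrp a file they own, and only to a group they belong to, so forcing group ownership after the fact is fragile. The setgid bit on the directories makes new files inherit the directory's group automatically, which may remove the need for chgrp entirely. Run once as root; paths assumed:

        chgrp -R site1 /home/site1/public_html
        # setgid on directories: anything created inside inherits group "site1"
        find /home/site1/public_html -type d -exec chmod g+s {} \;

    Joe and Bob still need to be in the site1 group, and the file *owner* will still be whoever created the file; if the owner must also be site1, the checkout itself has to run as site1, e.g. from the post-commit hook.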

    Read the article

  • Managing multiple reverse proxies for one virtual host in apache2

    - by Chris Betti
    I have many reverse proxies defined for my js-host VirtualHost, like so:

        # /etc/apache2/sites-available/js-host
        <VirtualHost *:80>
            ServerName js-host.example.com
            [...]
            ProxyPreserveHost On
            ProxyPass        /serviceA http://192.168.100.50/
            ProxyPassReverse /serviceA http://192.168.100.50/
            ProxyPass        /serviceB http://192.168.100.51/
            ProxyPassReverse /serviceB http://192.168.100.51/
            [...]
            ProxyPass        /serviceZ http://192.168.100.75/
            ProxyPassReverse /serviceZ http://192.168.100.75/
        </VirtualHost>

    The js-host site is acting as shared config for all of the reverse proxies. This works, but managing the proxies involves edits to the shared config and an apache2 restart. Is there a way to manage individual proxies with a2ensite and a2dissite (or a better alternative)? My main objective is to isolate each proxy config as a separate file, and manage it via commands.

    First attempt: I tried making separate files with their own VirtualHost entries for each service:

        # /etc/apache2/sites-available/js-host-serviceA
        <VirtualHost *:80>
            ServerName js-host.example.com
            [...]
            ProxyPass        /serviceA http://192.168.100.50/
            ProxyPassReverse /serviceA http://192.168.100.50/
        </VirtualHost>

        # /etc/apache2/sites-available/js-host-serviceB
        <VirtualHost *:80>
            ServerName js-host.example.com
            [...]
            ProxyPass        /serviceB http://192.168.100.51/
            ProxyPassReverse /serviceB http://192.168.100.51/
        </VirtualHost>

    The problem with this is that apache2 loads the first VirtualHost for a particular ServerName and ignores the rest. They aren't "merged" somehow, as I'd hoped.
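
    A hedged sketch of one way to keep a single vhost but split each proxy into its own file: include a fragment directory from inside the vhost and "enable" fragments by symlinking, mimicking a2ensite. Directory names here are assumptions:

        # /etc/apache2/sites-available/js-host
        <VirtualHost *:80>
            ServerName js-host.example.com
            ProxyPreserveHost On
            # Apache 2.4: IncludeOptional tolerates an empty directory;
            # on 2.2 use Include and keep at least one fragment present
            IncludeOptional /etc/apache2/proxies-enabled/*.conf
        </VirtualHost>

        # /etc/apache2/proxies-available/serviceA.conf
        ProxyPass        /serviceA http://192.168.100.50/
        ProxyPassReverse /serviceA http://192.168.100.50/

    Enabling is then a symlink from proxies-available into proxies-enabled followed by an apache2 reload (a graceful reload, not a full restart).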

    Read the article

  • Domain registration and DNS, what am I actually paying for? [on hold]

    - by jozxyqk
    Long story short, I'm quite confused as to exactly what is offered by domain registration and DNS service sites.

    When I go to the URL "http://google.com", my PC connects to a name server and gets the IP for "google.com", then connects to the IP and says: give me the page for "http://google.com". AFAIK there are many name servers and they all cache these bits of information in some hierarchical network, but ultimately a DNS record must come from a single source (not sure what this is called). There are different kinds of records, which might hold not an IP but an alias/redirect to other records, for example.

    Let's say I want my own domain name for some server. Maybe it even has a static IP, but I want a nicer thing for people to remember; or my ISP assigns dynamic IPs and I want a URL that always works; or my website is hosted on a shared machine, so the browser needs to send "http://mydnsname.com" to the webserver to distinguish it from other requests to the same IP but for different sites.

    Registering a domain costs a small amount of money per year. Where does this money go (not that I'm complaining :P)? Is that really all it costs to maintain the entire DNS system of nameservers? If I just register the domain and nothing else, what do I get? Is that just reserving a name, or hosting WHOIS information, or have I paid for a DNS record to be hosted? Can a domain alone have a record, such as an IP, or be an alias to another?

    A bunch of sites out there offer other services in addition to domain registration (I'm assuming they register the domain through another party for me). One example is "dynamic DNS" (DDNS), but isn't this just a regular DNS record that's updated regularly? Does it cost extra to update more often? Without DDNS, can a DNS record still point to an IP? I've also seen the term "managed DNS" and have no idea where that fits in.
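
    A hedged illustration of the two layers involved, using dig (example.com is a placeholder): the registration fee essentially pays the registry to publish the delegation (NS records) that points at whichever name servers host your zone, while the zone's actual records (A, CNAME, MX, ...) live on those name servers, which may be a freebie from the registrar or something you run yourself:

        dig +short NS example.com    # the delegation: who is authoritative for the zone
        dig +short A  example.com    # a record answered by those authoritative servers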

    Read the article

  • Configure TFS portal afterwards

    Update #1, January 8th, 2010: There is an updated post on this topic for Beta 2: http://www.ewaldhofman.nl/post/2009/12/10/Configure-TFS-portal-afterwards-Beta-2.aspx

    Update #2, October 10th, 2010: In the new Team Foundation Server Power Tools September 2010, there is now a command to create a portal:

        tfpt addprojectportal    Add or move portal for an existing team project

        Usage: tfpt addprojectportal /collection:uri
                                     /teamproject:"project name"
                                     /processtemplate:"template name"
                                     [/webapplication:"webappname"]
                                     [/relativepath:"pathfromwebapp"]
                                     [/validate]
                                     [/verbose]

        /collection       Required. URL of Team Project Collection.
        /teamproject      Required. Specifies the name of the team project.
        /processtemplate  Required. Specifies the name of the process template.
        /webapplication   The name of the SharePoint Web Application. Must also specify relativepath.
        /relativepath     The path for the site relative to the root URL for the SharePoint Web Application. Must also specify webapplication.
        /validate         Specifies that the user inputs are to be validated. If specified, only validation will be done and no portal setting will be changed.
        /verbose          Switches on the verbose mode.

    I created a new Team Project in TFS 2010 Beta 1 and chose not to configure SharePoint during the creation of the Team Project. Of course, I found out fairly quickly that a portal for TFS is very useful, especially the Iteration and Product Backlog workbooks and the dashboard reports. This blog describes how you can configure the SharePoint portal afterwards.

    Update, September 9th, 2009: Adding the portal afterwards is much easier than described below. Here are the steps.

    Step 1: Create a new temporary project (with a SharePoint site for it).
    - Open the Team Explorer.
    - In the Team Explorer, right click the root node (i.e. the project collection).
    - Select "New team project" from the menu.
    - Walk through the wizard and make sure you check the option to create the portal (which is checked by default).

    Step 2: Disable the site for the new project.
    - Open the Team Explorer and select the team project you created in step 1.
    - In the menu, click Team -> Show Project Portal.
    - In the menu, click Team -> Team Project Settings -> Portal Settings... The following dialog pops up.
    - Uncheck the option "Enable team project portal" and confirm the dialog with OK.

    Step 3: Enable the site for the original project, pointing it to the newly created site.
    - Open the Team Explorer and select the team project you want to add the portal to.
    - In the menu, open Team -> Team Project Settings -> Portal Settings... The same dialog as in step 2 pops up.
    - Check the option "Enable team project portal".
    - Click the "Configure URL" button. The following dialog pops up.
    - In the dialog, select the TFS server in the web application combobox.
    - Enter in the relative site path the text "sites/[Project Collection Name]/[Team Project Name created in step 1]".
    - Confirm the "Specify an existing SharePoint Site" dialog with OK.
    - Check the "Reports and dashboards refer to data for this team project" option.
    - Confirm the "Project Portal Settings" dialog with OK.

    Step 4: Delete the temporary project you created. In Beta 1, I have found no way to delete a team project. Maybe it will be available in TFS 2010 Beta 2.
    Original post

    Step 1: Create new portal site
    - Go to the SharePoint site of your project collection (http://<servername>/sites/<project_collection_name>/default.aspx).
    - Click on Site Actions at the left side of the screen and choose the option Site Settings.
    - In the site settings, choose the Sites and Workspaces option.
    - Create a new site: enter the values for the title, the description and the site address, and choose the TFS2010 Agile Dashboard as template.
    - Create the site by clicking on the Create button.

    Step 2: Integrate portal site with team project
    - Open Visual Studio and open the Team Explorer (View -> Team Explorer).
    - Select in the Team Explorer tool window the Team Project for which you are creating a new portal.
    - Open the Project Portal Settings (Team -> Team Project Settings -> Portal Settings...).
    - Check the "Enable team project portal" checkbox.
    - Click on Configure URL... You will get a new dialog as below.
    - Enter the URL to the TFS server in the web application combobox, and specify the relative site path: sites/<project collection>/<site name>.
    - Confirm with OK.
    - Check in the Project Portal Settings dialog the checkbox "Reports and dashboards refer to data for this team project".
    - Confirm the settings with OK (this takes a while...).

    When you now browse to the portal, you will see that the dashboards are showing up with the data for the current team project.

    Step 3: Download process template
    To get a copy of the documents that are default in a team project, we need a fresh set of files that are not attached to a team project yet. You can do that with the following steps:
    - Start the Process Template Manager (Team -> Team Project Collection Settings -> Process Template Manager...).
    - Choose the Agile process template and click Download.
    - Choose a folder to download to.

    Step 4: Add Product and Iteration backlog
    - Go to the Team Explorer in Visual Studio.
    - Make sure the team project is in the list of team projects, and expand the team project.
    - Right click the Documents node, and choose New Document Library.
    - Enter "Shared Documents", and click Add.
    - Right click the Shared Documents node and choose Upload Document.
    - Go to the file location where you stored the process template from step 3, and then navigate to the subdirectory "Agile Process Template 5.0\MSF for Agile Software Development v5.0\Windows SharePoint Services\Shared Documents\Project Management".
    - Select in the Open dialog the files "Iteration Backlog" and "Product Backlog", and click Open.

    Step 5: Bind Iteration backlog workbook to the team project
    - Right click on the "Iteration Backlog" file, select Edit, and confirm any warning messages.
    - Place your cursor in cell A1 of the Iteration Backlog worksheet.
    - Switch to the Team ribbon and click New List. Select your Team Project and click Connect.
    - From the New List dialog, select the Iteration Backlog query in the Workbook Queries folder.

    The final step is to add a set of document properties that allow the workbook to communicate with the TFS reporting warehouse. Before we create the properties we need to collect some information about your project. The first piece of information comes from the table created in the previous step. As you collect these properties, copy them into Notepad so they can be used in later steps. How to retrieve each value:
    - [Table name]: Switch to the Design ribbon and select the Table Name value in the Properties portion of the ribbon.
    - [Project GUID]: In the Visual Studio Team Explorer, right click your Team Project and select Properties. Select the URL value and copy the GUID (long value with lots of characters) at the end of the URL.
    - [Team Project name]: In the Properties dialog, select the Name field and copy the value.
    - [TFS server name]: In the Properties dialog, select the Server Name field and copy the value. [UPDATE] I have found that this is not correct: you need to specify the instance of your SQL Server. The value is used to create a connection to the TFS cube.

    Switch back to the Iteration Backlog workbook. Click the Office button and select Prepare – Properties. Click the Document Properties – Server drop down and select Advanced Properties. Switch to the Custom tab and add the following properties using the values you collected above:

        [Table name]_ASServerName     = [TFS server name]
        [Table name]_ASDatabase       = tfs_warehouse
        [Table name]_TeamProjectName  = [Team Project name]
        [Table name]_TeamProjectId    = [Project GUID]

    Click OK to close the properties dialog.

    It is possible that the Estimated Work (Hours) is showing the #REF! value. To resolve that, change the formula to:

        =SUMIFS([Table name][Original Estimate]; [Table name][Iteration Path];CurrentIteration&"*";[Table name][Area Path];AreaPath&"*";[Table name][Work Item Type]; "Task")

    For example:

        =SUMIFS(VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Original Estimate]; VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Iteration Path];CurrentIteration&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Area Path];AreaPath&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Work Item Type]; "Task")

    Also the Total Remaining Work in the Individual Capacity table may contain #REF! values. To resolve that, change the formula to:

        =SUMIFS([Table name][Remaining Work]; [Table name][Iteration Path];CurrentIteration&"*";[Table name][Area Path];AreaPath&"*";[Table name][Assigned To];[Team Member];[Table name][Work Item Type]; "Task")

    For example:

        =SUMIFS(VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Remaining Work]; VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Iteration Path];CurrentIteration&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Area Path];AreaPath&"*";VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Assigned To];[Team Member];VSTS_ab392b55_6647_439a_bae4_8c66e908bc0d[Work Item Type]; "Task")

    Save and close the workbook.

    Step 6: Bind Product backlog workbook to the team project
    Repeat the steps for binding the Iteration Backlog workbook for this workbook too. In the Capacity worksheet, the formula for the story points might be missing. You can resolve it with:

        =IF([Iteration]="";"";SUMIFS([Table name][Story Points];[Table name][Iteration Path];[Iteration]&"*"))

    For example:

        =IF([Iteration]="";"";SUMIFS(VSTS_487f1e4c_db30_4302_b5e8_bd80195bc2ec[Story Points];VSTS_487f1e4c_db30_4302_b5e8_bd80195bc2ec[Iteration Path];[Iteration]&"*"))

    Read the article

  • Tamilnadu HSC (+2) exam results – List of Websites

    - by samsudeen
    Tamilnadu State Board HSC (+2) exam results are expected to be released around the middle of this month (probably on May 19). School students can get their marks at the same time from their respective schools. The results are usually published on websites, or can be obtained from mobile phone service providers through SMS. But it is for sure that most of the sites will not be accessible for at least a couple of hours at the time of the result announcement. Below are some of the quality web sites (including mirror sites to directly access the results page) that publish the result links.

    http://www.tnresults.nic.in/
    http://www.squarebrothers.com/
    http://results.sify.com/
    http://indiaresults.com/
    http://www.dge1.tn.nic.in/
    http://www.dge2.tn.nic.in/
    http://www.dge3.tn.nic.in/
    http://www.tngdc.in/
    http://www.collegesintamilnadu.com/
    http://www.classontheweb.com/
    http://www.schools9.com/
    http://www.chennaivision.com/
    http://www.mygaruda.com/
    http://www.tnagar.com/
    http://www.indiacollegefinder.com/
    http://www.chennaionline.com/
    http://www.nakkeeran.com/
    http://www.getyourscore.in/
    http://www.examresults.net/
    http://results.webdunia.com/
    http://www.jayanews.in/
    http://www.findchennai.com/

    Read the article

  • Post build events using ROBOCOPY instead of XCOPY

    - by Vizioz Limited
    I don't know about you, but for a long time I have used XCOPY statements in my Visual Studio post-build events to copy my Umbraco files from the project folders to the local version of the website associated with the project.

    For the last few months we have been building a website framework for a client, who has subsequently sold the site to 5 clients, each with a different skin and some variations in their functional requirements. So we now have a single source solution that builds and copies the site files into 5 separate local websites, which enables us to easily test them all. What we found was that this process was starting to slow down our builds, reaching 30-45 seconds on a high-spec quad core machine (and longer on others).

    Today I asked Colin to create separate solution configurations within Visual Studio, so that while we are developing we can target a single site, and when we want to test all sites we can target "ALL" and the post-build script will then copy the files to all sites. This worked well, and with a couple of other optimisations our build was down to about 10 seconds for a single site.

    Then Colin came across ROBOCOPY and suggested that maybe this would be a suitable alternative to XCOPY. Well, I had not heard of it... (shock horror, some of you shout; others, I am sure, are like me and wondering what it is!)

    ROBOCOPY is new in Windows Vista & Windows 7 (you can also download it for XP & Windows 2003) and it has a lot of additional features. The switches most interesting to us were:

    /MIR = Mirror a folder tree
    /XD  = Exclude Directories
    /NP  = No Progress (i.e. it does not give you a chart of its results, which just fills up your Output window!)

    So we set about implementing ROBOCOPY. We decided to use the /MIR switch on all folders that we knew were always stored in our project folders:

    - images
    - css
    - masterpages
    - xslt

    For other files we just used straight ROBOCOPY functionality. We also decided to exclude all the .svn directories using the /XD switch, and finally we added the /NP switch as mentioned above. A sketch of the resulting post-build event is shown below.

    The beauty of all of this is the /MIR functionality, which means that only files that have changed are copied across. This greatly speeds up the process, especially on the images folders, which previously were copied across on every build. Now, if we add a new image to the project it will be copied across automatically and then never again, unless we change it of course!

    The build time now for all sites is approximately 4 seconds, and for a single site, 2 seconds. I would highly recommend taking the time to make the same optimisations to your build processes if you have not done so already.
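
    A hedged sketch of what such a post-build event might look like (paths and site names are invented; the error-level check is needed because ROBOCOPY returns non-zero exit codes below 8 even on success, which Visual Studio would otherwise report as a failed build):

        rem Visual Studio post-build event: mirror project assets to a local site
        robocopy "$(ProjectDir)images" "C:\inetpub\site1\images" /MIR /XD .svn /NP
        robocopy "$(ProjectDir)css"    "C:\inetpub\site1\css"    /MIR /XD .svn /NP
        rem robocopy exit codes 0-7 mean success; map them to 0 for MSBuild
        if %errorlevel% lss 8 exit 0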

    Read the article

  • Windows Azure Evolution – TFS Integration (WAWS Part 2)

    - by Shaun
    So this is the fourth blog post about the new features of Windows Azure, and the second part on Windows Azure Web Sites. But it is not focused only on WAWS, since the feature I'm going to introduce is available in both Windows Azure Web Sites and Windows Azure Cloud Services (a.k.a. hosted services). In the previous post I talked about Windows Azure Web Sites and how to use its gallery to build a WordPress personal blog without coding. Besides the gallery, we can create an empty web site and upload our website through various approaches. And one of the highlighted features here is that we can integrate our web site with a source control service, such as TFS or Git, so that it will be deployed automatically once a new commit or build is available.

    Create New Empty Web Site

    In the developer portal, when creating a new web site, we can select the QUICK CREATE item. This will create an empty web site with only one shared instance and without any database associated. Let's specify the URL, region and subscription and click OK. After a few seconds our website will be ready, and now we can click the BROWSE button to open this empty website. As you can see, there is a welcome page available in my website even though I didn't upload or deploy anything. This means the website is charged even before anything is deployed, similar to a cloud service (hosted service), because once we create a website, the Windows Azure platform arranges a hosting process (w3wp.exe) in the group of virtual machines.

    Create Project in TFS Preview Service and Set Up the Link

    Currently Windows Azure Web Sites can integrate with TFS and Git as deployment sources, and for TFS it only supports the Microsoft TFS Preview Service for now. I will not go deep into how to use the TFS preview service in this post, but once we click into the website we have just created and then click "Set up TFS publishing", a dialog will help us connect to the service. If you don't have an account, you can click the link shown below to request one.

    Assuming we already have a TFS service account, we first need to create a new project. Go to your TFS service website and create a new project, giving the project name, description and process template. Then, back in the developer portal, click the "Set up TFS publishing" link. In the pop-up window, provide the TFS service URL and click the "Authorize now" link. Click the "Accept" button to allow Windows Azure to connect to the TFS service. Back in the developer portal, all projects in the account are listed; just select the one we created and click OK. The web site is then linked to the specified TFS project, and the portal finally shows something like the screenshot below, which means the link succeeded.

    Work with TFS Preview Service in VS2010

    In the figure above there are some links guiding us through connecting to the TFS server from Visual Studio 2010 and 2012 RC. If you are using Visual Studio 2012 RC, you don't need any extension; but if you are using Visual Studio 2010 you must have SP1 and KB2581206 installed. To connect to the TFS service, open Visual Studio and, in the Team Explorer, add a new TFS server, pasting in the URL of the TFS service from the developer portal. Then select the project we just created, and it will be listed in the Team Explorer. Now let's start to build our website.
    Since the website we are going to build will be deployed to WAWS, it is NOT a cloud service and NOT a web role, so in this case we need to create a normal ASP.NET web application: for example, an ASP.NET MVC 3 web application. Next, right click on the solution, select "Add Solution to Source Control", and select the project we just created. Then check the code in.

    Once the check-in finishes, we can see that there is a build running on the TFS server, and if we go back to the developer portal, we will see a deployment running on our web site's deployment page. In fact, once we linked our web site to our TFS, a new build definition was created in our TFS project. It is triggered by each check-in and deploys to the linked web site automatically, so when our code has been compiled it is published to our web site from the TFS server. Once the build and deployment finish, we can see the deployment is now active in the developer portal, and we can view the web site that was created in Visual Studio and deployed by TFS.

    Continuous Deployment through VS and TFS

    A big benefit of TFS publishing is continuous deployment. Now if I change some code in Visual Studio (for example, update some text on the home page) and check in my changes, a new build is triggered and deployed to my WAWS automatically. Even better, if we want to roll back to a previous version, we can just select an existing deployment listed in the portal and click REDEPLOY at the bottom.

    Q&A: Can a Web Site use Storage and work with a Worker Role?

    Stacy asked a question on my previous post: can a web site use Windows Azure Storage and, furthermore, work with a worker role? Since the web site is deployed on Windows Azure virtual machines in the data center, it can use all Windows Azure features, such as storage, SQL databases, CDN, etc. But since a web site is normally a standard ASP.NET web application, PHP website or NodeJS app, the Windows Azure SDK is not referenced by default. We can add it ourselves: in our sample project, right click the MVC project and click "Manage NuGet Packages". In the dialog, search for the Windows Azure packages and select "Windows Azure Storage" to install. Then we have the assemblies for accessing Windows Azure Storage: tables, queues and blobs.

    Since I already have a storage account, let's have a quick demo: just list all blobs in a container. The code would be like this:
        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using System.Web.Mvc;
        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.StorageClient;

        namespace WAASTFSDemo.Controllers
        {
            public class HomeController : Controller
            {
                public ActionResult Index()
                {
                    ViewBag.Message = "Welcome to Windows Azure!";

                    var credentials = new StorageCredentialsAccountAndKey("[STORAGE_ACCOUNT]", "[STORAGE_KEY]");
                    var account = new CloudStorageAccount(credentials, false);
                    var client = account.CreateCloudBlobClient();
                    var container = client.GetContainerReference("shared");
                    ViewBag.Blobs = container.ListBlobs().Select(b => b.Uri.AbsoluteUri);

                    return View();
                }

                public ActionResult About()
                {
                    return View();
                }
            }
        }

    And the view:

        @{
            ViewBag.Title = "Home Page";
        }

        <h2>@ViewBag.Message</h2>
        <p>
            To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC Website">http://asp.net/mvc</a>.
        </p>
        <div>
            <ul>
            @foreach (var blob in ViewBag.Blobs)
            {
                <li>@blob</li>
            }
            </ul>
        </div>

    Then just check in the code and it will be deployed to the web site. Finally, we can see the blobs from my storage account rendered on the page.

    This is just an example, but it proves that web sites can connect to storage (tables, blobs and queues) as well. So the answer to Stacy is "yes": a web site can use queue storage to work with a worker role.

    Summary

    In this post I demonstrated how to integrate Windows Azure Web Sites with TFS. Our website can be built, uploaded and deployed automatically by the TFS service; all we need to do is provide the TFS account name and select the project. And not only Windows Azure Web Sites: in this upgrade, Windows Azure Cloud Services (hosted services) can be published through TFS as well, in a very similar way. Currently only the Microsoft TFS Preview Service can be integrated with Windows Azure, but I expect that in the future we will be able to link an enterprise TFS, or a third-party TFS such as CodePlex, to Windows Azure.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Speaking at Dog Food Conference 2013

    - by Brian T. Jackett
    Originally posted on: http://geekswithblogs.net/bjackett/archive/2013/10/22/speaking-at-dog-food-conference-2013.aspx

    It has been a couple of years since I last attended / spoke at Dog Food Conference, but on Nov 21-22, 2013 I'll be speaking at Dog Food Conference 2013 here in Columbus, OH. For those of you confused by the name of the conference (no, it's not about dog food), read up on the concept of dogfooding. This conference has a history of great sessions from local and regional speakers, and I look forward to being a part of it once again. Registration is now open (registration link) and is expected to sell out quickly. Reserve your spot today.

    Title: The Evolution of Social in SharePoint
    Audience and Level: IT Pro / Architect, Intermediate
    Abstract: Activities, newsfeed, community sites, following... these are just some of the big changes introduced to the social experience in SharePoint 2013. This class will discuss the evolution of the social components since SharePoint 2010, the architecture (distributed cache, microfeed, etc.) that supports the social experience, Yammer integration, and proper planning considerations when deploying social capabilities (personal sites, SkyDrive Pro and distributed cache). This session will include demos of the social newsfeed, community sites, and mentions. Attendees should have an intermediate knowledge of SharePoint 2010 or 2013 administration.

    -Frog Out

    Read the article

  • extra configuration needed after installing SSL certificate?

    - by ptriek
    We recently developed two rather simple PHP applications for AXA (a European bank). The URLs are axa.tfo.be/incentives/cipres and axa.tfo.be/incentives/zrkk (access to both sites is restricted to visitors with cookies containing encrypted passwords). In a previous security audit by an external company, several security issues were found. All of these issues have been solved by a colleague PHP developer. However, one last requirement has been added: all data should be transferred over HTTPS. My PHP colleague is on holiday, though, and unavailable at the moment. So I contacted my host and asked for an SSL certificate to be installed. I myself have no knowledge/experience with SSL, so I'm a bit at a loss with the following problems. A Comodo SSL certificate + unique IP address was installed today by my webhost (www.combell.be) for the subdomain axa.tfo.be. However, it doesn't seem to be working. I posted a question about this earlier today and was told not to worry, see link: http://serverfault.com/questions/339320/what-happens-if-you-install-an-ssl-certificate

    Current problems: the web applications aren't accessible over https, though http works (if a valid cookie is available); and there's a static HTML page at http://axa.tfo.be/incentives/cipres/static.html, and even that page is only accessible over http. My webhost is telling me that 'my application probably doesn't support SSL' and has asked me to set an SSL variable to true in my PHP code. So my questions: I have basic knowledge of PHP, but don't know where to start regarding the 'PHP SSL variable'. The sites have been online for some time and were developed for regular HTTP access. (Google didn't bring me any help either.) Can anyone point me in the right direction, or give me some clues about whether/what I should ask my webhost for further assistance? (I'm on a tight schedule: the sites will be audited again on Monday, and it's a customer I wouldn't want to lose...) Thanks for looking into this, and sorry if my questions sound a bit nooby; I'm a webdesigner, not a server specialist...
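
    A hedged note for context: serving an existing PHP site over HTTPS normally requires no PHP "SSL variable" at all. TLS terminates in the web server, so it is the hosting configuration that needs an SSL-enabled vhost. A minimal Apache sketch, with all paths and names invented:

        <VirtualHost 1.2.3.4:443>
            ServerName axa.tfo.be
            DocumentRoot /var/www/axa
            SSLEngine on
            SSLCertificateFile      /etc/ssl/certs/axa.tfo.be.crt
            SSLCertificateKeyFile   /etc/ssl/private/axa.tfo.be.key
            SSLCertificateChainFile /etc/ssl/certs/comodo-chain.crt
        </VirtualHost>

    The only PHP-side change that is sometimes wanted is marking session cookies secure (session.cookie_secure = 1 in php.ini), which may be the "variable" the webhost had in mind, though that is a guess.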

    Read the article

  • WebCenter Customer Advisory Board meetings kick off Oracle Open World 2012!

    - by Lance711
    Welcome to OpenWorld! OpenWorld 2012 got underway today with a series of meetings with the members of the WebCenter Customer Advisory Board. Led by the WebCenter Product Management team, these meetings are a great way for the product team and customers to directly interact and discuss real-life business challenges, product details, and upcoming features and functionality. This year, board members took part in discussions and live demos of product enhancements that will be featured throughout the coming week. Highlights included a variety of new mobile and social solutions, a great new user interface for WebCenter Content, plus new Portal and Sites functionality that makes the experience for the everyday user a lot more pleasant.

    The day kicked off with Roel Stalman, VP of Product Management, giving a detailed overview of what's new in WebCenter. Given all the improvements to discuss, this session went over 2 hours! Roel showcased the brand new UI for Content, Portal and Sites. He also gave live demos of the new mobile apps for WebCenter Content, Portal and the Oracle Social Network. The attendees then broke into sub-groups in order to deep-dive with Product Management for the Portal, Sites, and Content product areas on specific functionality and application integrations.

    If you are here in San Francisco this week for OpenWorld, I definitely recommend stopping by the WebCenter area in the Moscone West Exhibition Hall to see some of this new functionality for yourself. And be sure to check out the WebCenter sessions throughout the week, as those give us a chance to discuss direction and strategy, answer your questions, and get your feedback and ideas. For those of you who could not make it to OpenWorld this year, we miss you! You can stay in touch with what is happening via this blog and by following #oow and #webcenter on Twitter. Additionally, we will be rolling out details on upcoming products and release info over the coming months via this blog and web seminars. Stay tuned!

    Read the article
