Search Results

Search found 24291 results on 972 pages for 'site ripper'.

Page 12/972 | < Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >

  • Multiple cross-site scripting (XSS) vulnerabilities in OpenStack Horizon

    - by Ritwik Ghoshal
    CVE              Description                                 CVSSv2 Base Score   Component           Product and Resolution
    CVE-2014-3473    Cross-site scripting (XSS) vulnerability    4.3                 OpenStack Horizon   Solaris 11.2 (11.2.1.5.0)
    CVE-2014-3474    Cross-site scripting (XSS) vulnerability    4.3                 OpenStack Horizon   Solaris 11.2 (11.2.1.5.0)
    CVE-2014-3475    Cross-site scripting (XSS) vulnerability    4.3                 OpenStack Horizon   Solaris 11.2 (11.2.1.5.0)

    This notification describes vulnerabilities fixed in third-party components that are included in Oracle's product distributions. Information about vulnerabilities affecting Oracle products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • Learn How to Start a Membership Site

    Many people are curious to find out how to start a membership site. A membership site is a site where a visitor has to pay to view the content, and unauthorized visitors are prevented from browsing it.

    Read the article

  • Create a new site programmatically and select parent site? (SharePoint)

    - by peter
    Hi, I am using the following code to create a new site:

        newWeb = SPContext.GetContext(HttpContext.Current).Web.Webs.Add(
            newSiteUrl, newSiteName, null, (uint)1033, siteTemplate, true, false);
        newWeb.Update();

    newSiteUrl and newSiteName are values from two textboxes, and on whichever site I use this code (in a web part) the new site becomes a subsite of that site. I would now like to be able to select a parent site, so that the new site can sit anywhere in the site collection and not just as a subsite of the site where I use the web part. I created the following function to get all the sites in the site collection and populate a dropdown with the name and URL of every site:

        private void getSites()
        {
            SPSite oSiteCollection = SPContext.Current.Site;
            SPWebCollection collWebsite = oSiteCollection.AllWebs;
            for (int i = 0; i < collWebsite.Count; i++)
            {
                ddlParentSite.Items.Add(new ListItem(collWebsite[i].Title, collWebsite[i].Url));
            }
            oSiteCollection.Dispose();
        }

    If the user selects a site in the dropdown, is it possible to use that selected URL to decide where the new site should be created? I can't really get it to work, and the new site still becomes a subsite of the current one. I guess it has to do with HttpContext.Current? Any ideas on how I should do it instead? It's the first time I've written custom web parts, and the SharePoint object model is a bit overwhelming at the moment. Thanks in advance.
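    To illustrate the approach described above (an untested sketch - ddlParentSite, newSiteUrl, newSiteName and siteTemplate are the names from the question, and the dropdown value is assumed to hold the absolute URL added in getSites()): open the selected web explicitly and call Webs.Add on it, instead of on the context web.

        // Sketch: create the new site under the web picked in the dropdown,
        // rather than under the current context web.
        string parentUrl = ddlParentSite.SelectedValue;   // absolute URL added in getSites()

        using (SPSite parentSite = new SPSite(parentUrl))
        using (SPWeb parentWeb = parentSite.OpenWeb())
        using (SPWeb newWeb = parentWeb.Webs.Add(
            newSiteUrl, newSiteName, null, (uint)1033, siteTemplate, true, false))
        {
            newWeb.Update();
        }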

    Read the article

  • MacX DVD Ripper Pro is Free for How-To Geek Readers (Time Limited!)

    - by The Geek
    Want to rip a DVD to your hard drive, but don’t have a software package to do it? How-To Geek readers can get the normally non-free MacX DVD Ripper Pro for free, but only if you download your copy and install it before Saturday. Here’s how to get it. This time-limited offer is available to anybody for the next couple of days—just head to the download site, install the software package, and use the key provided. It’s as simple as that. Note: despite the confusion of the name, it’s available for both Mac and Windows.

    Read the article

  • PHP XPath - how to get value from result

    - by LeeTee
    I have the following XML file from eBay:

        <GeteBayDetailsResponse xmlns="urn:ebay:apis:eBLBaseComponents">
          <Timestamp>2012-07-04T12:02:14.541Z</Timestamp>
          <Ack>Success</Ack>
          <Version>779</Version>
          <Build>E779_INTL_BUNDLED_14986004_R1</Build>
          <SiteDetails><Site>US</Site><SiteID>0</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Canada</Site><SiteID>2</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>UK</Site><SiteID>3</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Germany</Site><SiteID>77</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Australia</Site><SiteID>15</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>France</Site><SiteID>71</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>eBayMotors</Site><SiteID>100</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Italy</Site><SiteID>101</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Netherlands</Site><SiteID>146</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Spain</Site><SiteID>186</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>India</Site><SiteID>203</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>HongKong</Site><SiteID>201</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Singapore</Site><SiteID>216</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Malaysia</Site><SiteID>207</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Philippines</Site><SiteID>211</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>CanadaFrench</Site><SiteID>210</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Poland</Site><SiteID>212</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Belgium_Dutch</Site><SiteID>123</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Belgium_French</Site><SiteID>23</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Austria</Site><SiteID>16</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Switzerland</Site><SiteID>193</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <SiteDetails><Site>Ireland</Site><SiteID>205</SiteID><DetailVersion>1</DetailVersion><UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime></SiteDetails>
          <UpdateTime>2009-07-09T10:48:17.000Z</UpdateTime>
        </GeteBayDetailsResponse>

    I need to get the value of the SiteID node where the Site node matches whatever variable I pass. The code I have is:

        $xml_file = 'SiteDetails.xml';
        $xmlDoc = new DomDocument();
        $xmlDoc->load($xml_file);
        $xpath = new DOMXpath($xmlDoc);
        $siteIDList = $xpath->query("/GeteBayDetailsResponse/SiteDetails[Site=$site]/SiteID");
        var_dump($siteIDList);
        echo $siteIDList->SiteID;

    I get the following result: object(DOMNodeList)#11 (0) { }. Can anyone help? I want to get a value of 3.

    Read the article

  • 10 steps to enable ‘Anonymous Access’ for your SharePoint 2010 site

    - by KunaalKapoor
    What’s Anonymous Access? Anonymous access to your SharePoint site enables all visitors to view your SharePoint site anonymously, without having to log in. With this blog I’d like to go through an easy step-by-step procedure to enable and set up anonymous access. Before you actually enable anonymous access on the site, you’ll have to change some settings at the web application level, so let’s start with that. A scripted alternative for the site-level part is sketched after the steps.

    Prerequisite(s):
    1. A hosted SharePoint 2010 farm/server.
    2. An existing SharePoint site.
    I mention the above pre-reqs because the steps below wouldn’t be valid for a different type of site.

    Step 1: In Central Administration, under Application Management, click on Manage web applications.
    Step 2: Select the web application of the site you want to enable anonymous access for and click on the Authentication Providers icon.
    Step 3: In the modal window, click on the Default zone.
    Step 4: Under the Edit Authentication section, check Enable anonymous access and click Save. This makes the Anonymous Access authentication mechanism available at the web application level in IIS; the web application will now allow anonymous access to be set.
    Step 5: Going back to Web Application Management, click on the Anonymous Policy icon.
    Step 6: Before proceeding any further, under Anonymous Access Restrictions select your zone, set the Permissions to None – No policy, and click Save.
    Step 7: Now let’s navigate to the top-level site collection of the web application. Click Site Actions > Site Settings.
    Step 8: Under Users and Permissions, click on Site Permissions.
    Step 9: Under the Edit tab, click on Anonymous Access.
    Step 10: Choose whether you want anonymous users to have access to the entire Web site or to lists and libraries only, and then click OK. You should now see the corresponding entry under your site permissions.

    Also keep in mind: if you are trying to access the site from a browser within the domain, you may need to change some browser settings to see the effect. Normally this is because the browser (Internet Explorer) is set to log in automatically to the intranet zone only, unless you have explicitly changed the zones and added the site to Trusted Sites. If you are testing from a box within your domain, try accessing the site after temporarily changing the Internet Explorer setting to Anonymous Logon for the zone the site is added to (for example, Intranet). You will find these settings under Tools > Internet Options > Security tab.
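    For completeness, the site-level part (steps 7–10) can also be scripted with the server object model. This is a minimal, untested sketch that assumes anonymous authentication has already been enabled on the web application’s Default zone (steps 1–6); the URL is a placeholder:

        using System;
        using Microsoft.SharePoint;

        class EnableAnonymousAccess
        {
            static void Main()
            {
                // Placeholder URL - point this at your own site collection.
                using (SPSite site = new SPSite("http://sp2010/sites/demo"))
                using (SPWeb web = site.OpenWeb())
                {
                    // On = entire web site; Enabled = lists and libraries only; Disabled = no anonymous access.
                    web.AnonymousState = SPWeb.WebAnonymousState.On;
                    web.Update();
                }
            }
        }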

    Read the article

  • How to determine the correct (case sensitive) URL for a SharePoint site

    - by Goyuix
    SharePoint is generally very tolerant of accepting a URL in a case-insensitive fashion; however, there are a few cases where it completely breaks down. For example, when creating a site column it somehow stores and uses the URL as it was at creation time, and when you try to edit the field definition through the Site Column Gallery (the fldedit.aspx page in LAYOUTS) you end up with the error below.

        Value does not fall within the expected range.
          at Microsoft.SharePoint.SPFieldCollection.GetFieldByInternalName(String strName, Boolean bThrowException)
          at Microsoft.SharePoint.SPFieldCollection.GetFieldByInternalName(String strName)
          at Microsoft.SharePoint.ApplicationPages.BasicFieldEditPage.OnLoad(EventArgs e)
          at System.Web.UI.Control.LoadRecursive()
          at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

    How can I reliably get the correct URL for a site/web? The SPSite.Url and SPWeb.Url properties seem to return whatever case they are instantiated with. In other words, the site collection is provisioned using the following URL: http://server/Path/Site. If I create a new site column using the SharePoint object model and happen to use http://server/path/site when instantiating the SPSite and SPWeb objects, the site column will be made available, but when trying to access it through the gallery the error above is generated. If I correct the URL in the address bar, I can still view/modify the definition of the SPField in question, but the default URL that is generated is bogus. Clear as mud? Example code (this is a bad example because of the case-sensitivity issue):

        // note: site should be partially caps: http://server/Path/Site
        using (SPSite site = new SPSite("http://server/path/site"))
        {
            using (SPWeb web = site.OpenWeb())
            {
                web.Fields.AddFieldAsXml("..."); // correct XML really here
            }
        }
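    One possible workaround (an untested sketch) is to re-open the site and web by their GUIDs, so that the URL properties should be resolved from what is stored in the content database rather than from whatever casing was typed:

        // Sketch: capture the IDs once, then re-open by ID to read the stored URL.
        Guid siteId;
        Guid webId;

        using (SPSite typedSite = new SPSite("http://server/path/site"))   // casing as typed
        using (SPWeb typedWeb = typedSite.OpenWeb())
        {
            siteId = typedSite.ID;
            webId = typedWeb.ID;
        }

        using (SPSite site = new SPSite(siteId))
        using (SPWeb web = site.OpenWeb(webId))
        {
            // web.Url / web.ServerRelativeUrl should now reflect the stored casing.
            web.Fields.AddFieldAsXml("...");   // field definition XML as in the question
        }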

    Read the article

  • How to create managed properties at the site collection level in SharePoint 2013

    - by ybbest
    In SharePoint 2013 you can create managed properties at the site collection level. Today I’d like to show you how to do so through PowerShell.

    1. Define your managed properties, crawled properties and managed property types in an external CSV file (a sample CSV layout is shown after the script). The PowerShell script will read this file and create the managed properties and the mappings.

    2. As you can see, I also defined the variant type; this is because you need the variant type to create the crawled property. In order to have the crawled properties, you need to do a full crawl and also make sure you have data populated for your custom columns. However, if you do not want to run a full crawl to create those crawled properties, you can create them yourself using PowerShell; you just need to make sure the crawled properties you create have the same names they would have if created by a full crawl.

    Managed property types:
    Text = 1
    Integer = 2
    Decimal = 3
    DateTime = 4
    YesNo = 5
    Binary = 6

    Variant types:
    Text = 31
    Integer = 20
    Decimal = 5
    DateTime = 64
    YesNo = 11

    3. You can use the following script to create your managed properties at the site collection level; the difference when creating a managed property at the site collection level is that you pass in the site collection ID.

        param(
            [string] $siteUrl = "http://SP2013/",
            [string] $searchAppName = "Search Service Application",
            $ManagedPropertiesList = (IMPORT-CSV ".\ManagedProperties.csv")
        )

        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

        $searchapp = $null

        function AppendLog
        {
            param ([string] $msg, [string] $msgColor)
            $currentDateTime = Get-Date
            $msg = $msg + " --- " + $currentDateTime
            if (!($logOnly -eq $True))
            {
                # write to console
                Write-Host -f $msgColor $msg
            }
            # write to log file
            Add-Content $logFilePath $msg
        }

        $scriptPath = Split-Path $myInvocation.MyCommand.Path
        $logFilePath = $scriptPath + "\CreateManagedProperties_Log.txt"

        function CreateRefiner
        {
            param ([string] $crawledName, [string] $managedPropertyName, [Int32] $variantType, [Int32] $managedPropertyType, [System.GUID] $siteID)

            $cat = Get-SPEnterpriseSearchMetadataCategory -Identity SharePoint -SearchApplication $searchapp
            $crawledproperty = Get-SPEnterpriseSearchMetadataCrawledProperty -Name $crawledName -SearchApplication $searchapp -SiteCollection $siteID

            if ($crawledproperty -eq $null)
            {
                AppendLog "Creating Crawled Property for $managedPropertyName" Yellow
                $crawledproperty = New-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $searchapp -VariantType $variantType -SiteCollection $siteID -Category $cat -PropSet "00130329-0000-0130-c000-000000131346" -Name $crawledName -IsNameEnum $false
            }

            $managedproperty = Get-SPEnterpriseSearchMetadataManagedProperty -Identity $managedPropertyName -SearchApplication $searchapp -SiteCollection $siteID -ErrorAction SilentlyContinue

            if ($managedproperty -eq $null)
            {
                AppendLog "Creating Managed Property for $managedPropertyName" Yellow
                $managedproperty = New-SPEnterpriseSearchMetadataManagedProperty -Name $managedPropertyName -Type $managedPropertyType -SiteCollection $siteID -SearchApplication $searchapp -Queryable:$true -Retrievable:$true -FullTextQueriable:$true -RemoveDuplicates:$false -RespectPriority:$true -IncludeInMd5:$true
            }

            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ? { $_.Name -eq $managedProperty.Name }

            if ($mappedProperty -eq $null)
            {
                AppendLog "Creating Crawled -> Managed Property mapping for $managedPropertyName" Yellow
                New-SPEnterpriseSearchMetadataMapping -CrawledProperty $crawledproperty -ManagedProperty $managedproperty -SearchApplication $searchapp -SiteCollection $siteID
            }

            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ? { $_.Name -eq $managedProperty.Name }
            # Get-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $managedproperty
        }

        $searchapp = Get-SPEnterpriseSearchServiceApplication $searchAppName
        $site = Get-SPSite $siteUrl
        $siteId = $site.id

        Write-Host "Start creating Managed properties"
        $i = 1
        FOREACH ($property in $ManagedPropertiesList)
        {
            $propertyName = $property.managedPropertyName
            $crawledName = $property.crawledName
            $managedPropertyType = $property.managedPropertyType
            $variantType = $property.variantType
            Write-Host $managedPropertyType
            Write-Host "Processing managed property $propertyName $($i)..."
            $i++
            CreateRefiner $crawledName $propertyName $variantType $managedPropertyType $siteId
            Write-Host "Managed property created " $propertyName
        }

    Key Concepts

    Crawled Properties: Crawled properties are discovered by the search index service component when crawling content.
    Managed Properties: Properties that are part of the Search user experience, which means they are available for search results, advanced search, and so on, are managed properties.
    Mapping Crawled Properties to Managed Properties: To make a crawled property available for the Search experience—to make it available for Search queries and display it in Advanced Search and search results—you must map it to a managed property.

    References

    Administer search in SharePoint 2013 Preview
    Managing Metadata
    New-SPEnterpriseSearchMetadataCrawledProperty
    New-SPEnterpriseSearchMetadataManagedProperty
    Remove-SPEnterpriseSearchMetadataManagedProperty
    Overview of crawled and managed properties in SharePoint 2013 Preview
    SharePoint 2013 – Search Service Application
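    For reference, here is a minimal sketch of what the ManagedProperties.csv file could look like. The column names are the ones the script reads (managedPropertyName, crawledName, managedPropertyType, variantType); the property names and values themselves are only hypothetical examples, using the type codes listed above (Text = 1/31, Decimal = 3/5).

        managedPropertyName,crawledName,managedPropertyType,variantType
        YBBESTProjectName,ows_ProjectName,1,31
        YBBESTProjectCost,ows_ProjectCost,3,5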

    Read the article

  • SharePoint 2010 Hosting :: Error – HTTP Error 401.1 when Accessing Your SharePoint 2010 Site

    - by mbridge
    When attempting to view a MOSS (SharePoint) 2007 or SharePoint 2010 site locally from a Web Front End (WFE), you get an error stating: “HTTP Error 401.1 – Unauthorized: Access is denied due to invalid credentials.” I have noticed that this happens on Windows Server 2003/2008 SP1/SP2/R2 when using host headers and Alternate Access Mappings on a web application in MOSS 2007. If you can access the site from remote machines but cannot access it from the server itself, then this might be your issue. (If you cannot access the web site either locally or remotely from other machines, then there is an issue with security on the site and/or possibly a Kerberos-related security issue.)

    I implemented fix #2 from the Microsoft KB article below on all servers in the MOSS 2007 farm (WFEs and the Indexing/Search server), and for all my newer farm installs - both SharePoint 2007 (MOSS) and SharePoint 2010 - I use method 2 on all SharePoint and SQL Servers in the farm. If you use method 1 instead, add all host headers and Alternate Access Mappings for all web applications to the BackConnectionHostNames value; you will then be able to access the sites locally from the WFEs.

    Microsoft KB link: http://support.microsoft.com/kb/896861

    Method 1: Specify Host Names
    1. Click Start, click Run, type regedit, and then click OK.
    2. In Registry Editor, locate and then click the following registry key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
    3. Right-click MSV1_0, point to New, and then click Multi-String Value.
    4. Type BackConnectionHostNames, and then press ENTER.
    5. Right-click BackConnectionHostNames, and then click Modify.
    6. In the Value data box, type the host name or the host names for the sites that are on the local computer, and then click OK.
    7. Quit Registry Editor, and then restart the IISAdmin service.

    Method 2: Disable the Loopback Check
    1. Click Start, click Run, type regedit, and then click OK.
    2. In Registry Editor, locate and then click the following registry key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa
    3. Right-click Lsa, point to New, and then click DWORD Value.
    4. Type DisableLoopbackCheck, and then press ENTER.
    5. Right-click DisableLoopbackCheck, and then click Modify.
    6. In the Value data box, type 1, and then click OK.
    7. Quit Registry Editor, and then restart your computer.

    Give it a try and good luck.
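    If you prefer to script these registry changes (for example, to apply them to every server in the farm), the equivalent reg.exe commands are sketched below. The host names in the first command are placeholders - substitute your own host headers / Alternate Access Mapping host names - and, as above, method 1 needs an IISAdmin restart while method 2 needs a reboot.

        :: Method 1 - BackConnectionHostNames (REG_MULTI_SZ, entries separated by \0)
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" /v BackConnectionHostNames /t REG_MULTI_SZ /d "portal.contoso.com\0intranet.contoso.com" /f

        :: Method 2 - DisableLoopbackCheck
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v DisableLoopbackCheck /t REG_DWORD /d 1 /f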

    Read the article

  • Redirect all access requests to a domain and subdomain(s) except from specific IP address? [closed]

    - by Christopher
    This is a self-answered question... After much wrangling I found the magic combination of mod_rewrite rules, so I'm posting it here. My scenario is that I have two domains - domain1.com and domain2.com - both of which are currently serving identical content (by way of a global 301 redirect from domain1 to domain2). Domain1 was then chosen to be repurposed as a 'portal' domain - with a corporate CMS-based site leading off from the front page, and the existing 'retail' domain (domain2) left to serve the main web site. In addition, a staging subdomain was created on domain1 in order to prepare the new corporate site without impinging on the root domain's existing operation. I contemplated just rewriting all requests to domain2 and setting up the new corporate site 'behind the scenes' without using a staging domain, but I usually use subdomains when setting up new sites. Finally, I required access to the 'actual' contents of the domains and subdomains - i.e., to not be redirected like all other visitors - in order that I can develop the new site and test it in the staging environment on the live server, as I'm not using a separate development webserver in this case. I also have another test subdomain on domain1 which needed to be preserved.

    The way I eventually set it up was as follows (10.2.2.1 would be my home WAN IP).

    .htaccess in the root of domain1:

        RewriteEngine On
        RewriteCond %{REMOTE_ADDR} !^10\.2\.2\.1
        RewriteCond %{HTTP_HOST} !^staging.domain1.com$ [NC]
        RewriteCond %{HTTP_HOST} !^staging2.domain1.com$ [NC]
        RewriteRule ^(.*)$ http://domain2.com/$1 [R=301]

    .htaccess in the staging subdomain on domain1:

        RewriteEngine On
        RewriteCond %{REMOTE_ADDR} !^10\.2\.2\.1
        RewriteCond %{HTTP_HOST} ^staging.domain1.com$ [NC]
        RewriteRule ^(.*)$ http://domain2.com/$1 [R=301,L]

    The multiple .htaccess files and multiple rulesets require more processing overhead and longer iteration, as the visitor is potentially redirected twice; however, I find it to be a more granular method of control, since I can selectively allow more than one IP address access to individual staging subdomain(s) without automatically granting them access to everything else. It also keeps the rulesets fairly simple and easy to read (or re-interpret, because I'm always forgetting how I put rules together!). If anybody can suggest a more efficient way of merging all these rules and conditions into just one main ruleset in the root of domain1, please post! I'm always keen to learn; this post is more my attempt to preserve this information for those who are looking to redirect entire domains for all visitors except themselves (for design/testing purposes) and not just deny specific file access for maintenance mode (there are many good examples of simple mod_rewrite rules for 'maintenance mode' style operation easily findable via Google).

    You can also extend the IP address detection - firstly by using wildcards, e.g. ^10\.2\.2\..*: the last octet's \..* denotes the usual "." and then "zero or more arbitrary characters", signified by the .* - so you can specify particular ranges of IPs in a subnet or entire subnets if you wish. You can also use square brackets: ^10\.2\.[1-255]\.[120-140]; ^10\.2\.[1-9]?[0-9]\.; ^10\.2\.1[0-1][0-9]\. etc. The third way, if you wish to specify multiple discrete IP addresses, is to bracket them in the style of ^(1.1.1.1|2.2.2.2|3.3.3.3)$, and you can of course use square brackets to substitute octets or single digits again.

    NB: if you're using individual RewriteCond lines to specify multiple IPs / ranges, make sure to put [OR] at the end of each one, otherwise mod_rewrite will interpret them as "if IP address matches 1.1.1.1 AND if IP address matches 2.2.2.2...", which is of course impossible! However, as far as I'm aware this isn't necessary if you're using the ! negator to specify "and is not...".

    Kudos also to SE: this older question also came in useful when I was verifying my own knowledge prior to my futzing around with code. This page was helpful, as were the various other links posted below (can't hyperlink them all due to spam protection... other regex checkers are available). The AddedBytes cheat sheet's useful to pin up on your wall.

    Other referenced URLs:
    internetofficer.com/seo-tool/regex-tester/
    fantomaster.com/faarticles/rewritingurls.txt
    addedbytes.com/cheat-sheets/mod_rewrite-cheat-sheet/

    Read the article

  • Migrate ASP.Net web site from IIS6 to IIS7

    - by David.Chu.ca
    I have to migrate an ASP.NET web site from IIS6 to IIS7. I tried to copy all the files for the web site from IIS6 (c:\inetpub\wwwroot\MySite) to another box with Windows Server 2008 R2, where IIS7 is the default web server. However, simply copying the files does not seem to work. Should I rebuild the web site for IIS7, or should I make changes on the new box with IIS7, such as in web.config?

    Thanks for the comments. On further investigation I found that the http handlers seem to have caused the exception:

        <!--httpHandlers>
            <add path="Reserved.ReportViewerWebControl.axd" verb="*"
                 type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
                 validate="false"/>
        </httpHandlers-->

    After I comment out the above handler in web.config, the web page works fine. This is just my initial test. I am not sure if I should rebuild the web site from source code or not. If so, do I need to target IIS7 specifically?
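    For reference, if the ReportViewer handler is still needed on the new box, the usual IIS7 integrated-pipeline approach is to register it under system.webServer rather than (or in addition to) system.web/httpHandlers. This is only a sketch using the same type string as above and an arbitrary handler name; it has not been verified against this particular site:

        <system.webServer>
            <validation validateIntegratedModeConfiguration="false" />
            <handlers>
                <add name="ReportViewerWebControlHandler" preCondition="integratedMode" verb="*"
                     path="Reserved.ReportViewerWebControl.axd"
                     type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
            </handlers>
        </system.webServer>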

    Read the article

  • Changing IP address in IIS for SharePoint site results in Directory listing error

    - by Dan
    I have a server here that has 2 roles. One is Exchange 2007 and the other is MOSS 2007. In IIS I have a site, go.domain.com, which hosts our OWA. The other is internal.domain.com, which is the MOSS site. I have given the NIC local IPs and each site is using host headers. The GO site has an SSL cert from NetSol, and the MOSS site has a self-signed one. Right now, going to either site presents the NetSol certificate, which browsers complain about when going to the internal.domain.com site - obviously, since both sites share the same IP in IIS. Both sites have always run off the original IP of 10.0.0.3 in IIS. When I added the second IP to the NIC (10.0.0.6) and changed the SharePoint site in IIS to use it for http and https access, I started getting this message in the browser when trying to connect:

        Directory Listing Denied
        This Virtual Directory does not allow contents to be listed.

    Changing the IP back to 10.0.0.3 brings the internal site back up. What am I missing here? Do I need to fool around with Alternate Access Mappings in Central Admin? Am I completely missing the point with multiple SSL certs and host headers?

    Read the article

  • How to publish an Access DB to an https SharePoint 2010 site with a self-signed certificate

    - by ybbest
    If you are having trouble publishing your Access database to an https SharePoint 2010 site with a self-signed certificate, read on.

    Problem: When publishing, you first get a security warning and then the error message “The name of the security certificate is invalid or does not match the name of the site”.

    Solution: This error appears when the ‘common name’ in the certificate doesn’t match the address you typed in the browser to access the site. To fix the problem, you need to generate the certificate with a script rather than through the IIS UI; the UI defaults the common name to the server name, and you will hit the above problem when using that certificate on a web application with a different host name. You can use SelfSSL.exe (IIS 6.0 only), where you have to specify the common name (cn), for example:

        selfssl.exe /T /N:cn=testsharepoint.com /K:1024 /V:7 /S:1 /P:443

    Or you can use makecert (IIS 7.0 and above):

        makecert -r -pe -n 'CN=my.domain.here' -b 01/01/2000 -e 01/01/2036 -eku 1.3.6.1.5.5.7.3.1 -ss my -sr localMachine -sky exchange -sp "Microsoft RSA SChannel Cryptographic Provider" -sy 12

    After you have created the certificate, you then need to bind that self-signed certificate to your IIS web site and add it to the Trusted Root Certification Authorities store (to get there, press Windows+R, type mmc.exe, and add the Certificates console). I have compiled this solution from the questions I asked on SharePoint Stack Exchange.
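    If you prefer the command line to the MMC console, one possible way to add the certificate to the Trusted Root Certification Authorities store of the local machine (assuming the self-signed certificate has been exported to a .cer file; the file name below is a placeholder) is:

        certutil -addstore -f Root testsharepoint.cer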

    Read the article

  • Google Analytics Visitors drop-off for certain region of site only

    - by crmpicco
    I have an issue with the tracking on my site where I have seen a dramatic drop-off of visitors to the site from a certain region. I have four regions on my site at the moment: UK, EU, US and RoW (Rest of the World). The UK, EU and US regions are unaffected; only the RoW region suffers this drop-off. I have included a screenshot from my GA account which shows this effect. My GA code, which is included on every page of the site, is below. I have changed the UA account number intentionally for this example. There have been no changes made to the GA account or the tracking code in a live environment for some considerable time, but for some reason I am seeing the drop-off for this region only. In the code below I am not tracking page views on certain pages, as I have event tracking set up for those pages.

        <script type="text/javascript">
            var _gaq = _gaq || [];
            _gaq.push(['_setAccount', 'UA-18721873-5']);
            _gaq.push(['_setCookiePath', '/row/']);
            if ( typeof(p_page) != 'undefined') {
                // do nothing if user is on above pages
                // N.B. there are a series of conditions in this if statement checking that we are not on a particular page
            } else {
                _gaq.push(['_trackPageview']);
            }
        </script>

    Read the article

  • WebMatrix “The Site Has Stopped” Fix

    - by Tarun Arora
    I just got started with Azure Web Sites by creating a website from the WordPress template. Next I installed WebMatrix so that I could run the website locally. Every time I tried to run my website from WebMatrix I hit the message “The following site has stopped ‘xxx’”.

    Step 00 – Analysis
    It took a bit of time to figure out that WebMatrix makes use of IIS Express, but it was easy to see that IIS Express was not showing up in the system tray when I started WebMatrix. This was a good indication that IIS Express was having trouble starting up. So I opened a CMD prompt and tried to run IISExpress.exe, which failed with an error. Running IISExpress.exe /trace:Error gave a more detailed reason for the failure.

    Step 1 – Fixing “The following site has stopped ‘xxx’”
    Further analysis revealed that the IIS Express config files had been corrupted. So I navigated to C:\Users\<UserName>\Documents\IISExpress\config and deleted the files applicationhost.config, aspnet.config and redirection.config (please take a backup of these files before deleting them). Back in CMD, I ran IISExpress /trace:Error again; this time IIS Express successfully started and parked itself in the system tray. I opened up WebMatrix and clicked Run, and the default site loaded in the browser without any failures.

    Step 2 – Download the WordPress Azure Web Site using WebMatrix
    Because the config files ‘applicationhost.config’, ‘aspnet.config’ and ‘redirection.config’ were deleted, I lost the settings of my Azure-based WordPress site that I had downloaded to run from WebMatrix. This was simple to sort out: open up WebMatrix and go to the Remote tab, click on Download, then export the PublishSettings file from the Azure Management Portal and upload it in the pop-up you get after clicking Download. Now you should have your Azure WordPress website all set up and running from WebMatrix. Enjoy!
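    The Step 1 fix can also be done from a command prompt. This is just a sketch of the steps described above; it renames the files instead of deleting them, so you keep a backup:

        cd /d "%USERPROFILE%\Documents\IISExpress\config"
        ren applicationhost.config applicationhost.config.bak
        ren aspnet.config aspnet.config.bak
        ren redirection.config redirection.config.bak

        cd /d "%ProgramFiles%\IIS Express"
        iisexpress.exe /trace:error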

    Read the article

  • How to add files to a document library in a site definition in SharePoint 2007?

    - by jaloplo
    Hi all, I'm doing a site definition for SharePoint 2007. When the site is created, a document library called "Folder2" is also created. Now I need to add some documents to this document library so that they appear as items in the document library's standard views. My code is:

        <Lists>
            <List FeatureId="00bfea71-e717-4e80-aa17-d0c71b360101" Type="101" Title="Folder2" Url="Folder2">
                <Data>
                    <Rows>
                        <Row>
                            <Field Name="Name">MyFile.txt</Field>
                            <Field Name="Title">MyFile.txt</Field>
                            <Field Name="FileLeafRef">MyFile.txt</Field>
                        </Row>
                    </Rows>
                </Data>
            </List>
        </Lists>

    When I look at the items in the document library, there is one element with the title "1_". Does anybody know how to add files in a site definition? The onet.xml I used is the same as the Blank Site one. Thanks!!!

    Read the article

  • Site Action Button Missing

    - by David
    I have a MOSS 2007 site collection which was running smoothly. However, after moving to Forms Authentication connected to Active Directory, neither my Farm Administrator account nor the 2 site collection administrators can get past the site homepage. There is no Site Actions button available. When I log into Central Administration with the Farm Administrator account the Site Actions button exists there, so I suspect it's something to do with the permissions on the site application. I have tried to connect directly to the settings URL http://mysite/_layouts/settings.aspx etc. but get the usual error:

        Error: Access Denied
        Current User
        You are currently signed in as: xxxx (Farm Administrator Account)
        Sign in as a different user
        Request access

    I get the same error when logging in with the site collection admin accounts. I've also checked to make sure that the sites are not locked.

    Read the article

  • Home page of my site does not appear in Google search when I type site:mario--games.com

    - by Bimal Das
    I am totally confused: when I search for site:mario--games.com in Google, I see every page of my website in the SERP (search engine results page) except the home page. I don't know why this is happening; I have checked everything according to Google's SEO guidelines without success, and I have checked my site through Google Webmaster Tools and everything looks fine there. Please help me solve this. The .htaccess rules of my site:

        RewriteEngine on
        #RewriteBase /
        #RewriteCond %{HTTP_REFERER} !^$
        #RewriteCond %{HTTP_REFERER} !^http://(www\.)?jocuri-barbie.ro/.*$ [NC]
        #RewriteCond %{HTTP_REFERER} !^http://(www\.)?joc-jocuri-barbie.ro/.*$ [NC]
        #RewriteRule \.(gif|jpg|swf|flv|png)$ / [F]
        RewriteCond %{HTTP_HOST} ^sitename\.com$
        RewriteRule ^(.*)$ http://www.sitename.com/$1 [R=301,L]
        RewriteRule ^privacy_policy.html$ privacy.php
        RewriteRule ^contact_us.html$ contact.php
        #RewriteCond %{HTTP_HOST} !^sitename.com$
        #RewriteRule ^(.*)$ http://sitename.com/$1 [L,QSA,R=301]
        RewriteRule ^index\.php$ / [QSA,R=301]
        RewriteRule ^([A-Za-z0-9_]+)_([0-9]+).html$ articol.php?id=$2
        RewriteRule ^([a-z_]+).html$ categorii.php?abbr=$1
        RewriteRule ^([A-Za-z0-9+]+)$ search.php?cuvinte=$1
        RewriteRule ^images/$ images [F]
        RewriteRule ^o_articole/$ o_articole [F]

    Read the article

  • How to Search Just the Site You’re Viewing Using Google Search

    - by The Geek
    Have you ever wanted to search the site you’re viewing, but the built-in search box is either hard to find, or doesn’t work very well? Here’s how to add a special keyword bookmark that searches the site you’re viewing using Google’s site: search operator. This technique should work in either Google Chrome or Firefox—in Firefox you’ll want to create a regular bookmark and add the script into the keyword field, and for Google Chrome just follow the steps we’ve provided below.

    Read the article

  • How Google handles site traffic in Google Analytics

    - by Hamidreza
    I have a site at www.example.com and I have put the Google Analytics JavaScript snippet on it. I have made an app for my site, and I want every use of the app to count as a visit to the site: the app opens the site in its built-in browser (I am using C# for the application and the .NET WebBrowser control). The app will request www.example.com/appvisit, and I have put the Google Analytics script on that page and nothing else. I also want to disallow the /appvisit address in my robots.txt file. I want to know: is there any problem with doing this? Will Google crawl the /appvisit directory? Does Google frown on this approach? And will Google consider this traffic genuine and normal? Thanks.
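    For reference, the robots.txt entry described above would be a minimal rule like this (assuming the file sits at the root of www.example.com):

        User-agent: *
        Disallow: /appvisit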

    Read the article

< Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >