Search Results

Search found 24862 results on 995 pages for 'site verification'.

Page 15/995 | < Previous Page | 11 12 13 14 15 16 17 18 19 20 21 22  | Next Page >

  • How to determine the correct (case sensitive) URL for a SharePoint site

    - by Goyuix
    SharePoint is generally very tolerant of accepting a URL in a case-insensitive fashion, however there are a few cases where it completely breaks down. For example, when creating a site column it somehow stores and uses the URL when it was created, and when trying to edit the field definition through the Site Column Gallery (fldedit.aspx page in the LAYOUTS) you end up throwing the error below. Value does not fall within the expected range. at Microsoft.SharePoint.SPFieldCollection.GetFieldByInternalName(String strName, Boolean bThrowException) at Microsoft.SharePoint.SPFieldCollection.GetFieldByInternalName(String strName) at Microsoft.SharePoint.ApplicationPages.BasicFieldEditPage.OnLoad(EventArgs e) at System.Web.UI.Control.LoadRecursive() at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) How can I reliably get the correct URL for a site/web? The SPSite.Url and SPWeb.Url properties seem to return back whatever case they are instantiated with. In other words, the site collection is provisioned using the following URL: http://server/Path/Site If I create a new Site Column using the SharePoint object model and happen to use http://server/path/site when instantiating the SPSite and SPWeb objects, the site column will be made available but when trying to access it through the gallery the error above is generated. If I correct the URL in the address bar, I can still view/modify the definition for the SPField in question, but the default URL that is generated is bogus. Clear as mud? Example code: (this is a bad example because of the case sensitivity issue) // note: site should be partially caps: http://server/Path/Site using (SPSite site = new SPSite("http://server/path/site")) { using (SPWeb web = site.OpenWeb()) { web.Fields.AddFieldAsXml("..."); // correct XML really here } }
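    A possible workaround (a hedged sketch, not from the original question): re-open the site collection by its GUID rather than by a typed URL, on the assumption that the URL resolved from the configuration database carries the original casing. Verify the behaviour on your own farm before relying on it; the snippet below is PowerShell against the same object model.
        # Sketch only: load the object model, open the site by whatever URL you have,
        # then re-open it by ID to get the canonically-cased URL before adding the field.
        [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
        $typed = New-Object Microsoft.SharePoint.SPSite("http://server/path/site")   # lower-case on purpose
        $canonical = New-Object Microsoft.SharePoint.SPSite($typed.ID)                # re-open by GUID
        $web = $canonical.OpenWeb()
        Write-Host $web.Url                    # expected: http://server/Path/Site
        $web.Fields.AddFieldAsXml("...")       # real field XML goes here
        $web.Dispose()
        $canonical.Dispose()
        $typed.Dispose()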

    Read the article

  • MIDlet + BlackBerry API = verification error?

    - by Kilnr
    Hi, Is there any way to write a MIDlet, but still use BlackBerry API classes and functions (including the APIs that require code signing)? In particular, I'm trying to use Kuix (http://www.kalmeo.org/projects/kuix). A pure MIDlet + Kuix (so without BlackBerry stuff) works perfectly after I converted the jar/jad to a cod file. As soon as I add BlackBerry API-stuff (CoverageInfo.COVERAGE_DIRECT in this case) I get a verification error when trying to run the cod file: Error starting $name: Module $name has verification error 2410 at offset 9a4f What can I do to solve this?

    Read the article

  • How to create managed properties at site collection level in SharePoint2013

    - by ybbest
    In SharePoint2013, you can create managed properties at site collection. Today, I’d like to show you how to do so through PowerShell. 1. Define your managed properties and crawled properties and managed property Type in an external csv file. PowerShell script will read this file and create the managed and the mapping. 2. As you can see I also defined variant Type, this is because you need the variant type to create the crawled property. In order to have the crawled properties, you need to do a full crawl and also make sure you have data populated for your custom column. However, if you do not want to a full crawl to create those crawled properties, you can create them yourself by using the PowerShell; however you need to make sure the crawled properties you created have the same name if created by a full crawl. Managed properties type: Text = 1 Integer = 2 Decimal = 3 DateTime = 4 YesNo = 5 Binary = 6 Variant Type: Text = 31 Integer = 20 Decimal = 5 DateTime = 64 YesNo = 11 3. You can use the following script to create your managed properties at site collection level, the differences for creating managed property at site collection level is to pass in the site collection id. param( [string] $siteUrl="http://SP2013/", [string] $searchAppName = "Search Service Application", $ManagedPropertiesList=(IMPORT-CSV ".\ManagedProperties.csv") ) Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue $searchapp = $null function AppendLog { param ([string] $msg, [string] $msgColor) $currentDateTime = Get-Date $msg = $msg + " --- " + $currentDateTime if (!($logOnly -eq $True)) { # write to console Write-Host -f $msgColor $msg } # write to log file Add-Content $logFilePath $msg } $scriptPath = Split-Path $myInvocation.MyCommand.Path $logFilePath = $scriptPath + "\CreateManagedProperties_Log.txt" function CreateRefiner {param ([string] $crawledName, [string] $managedPropertyName, [Int32] $variantType, [Int32] $managedPropertyType,[System.GUID] $siteID) $cat = Get-SPEnterpriseSearchMetadataCategory –Identity SharePoint -SearchApplication $searchapp $crawledproperty = Get-SPEnterpriseSearchMetadataCrawledProperty -Name $crawledName -SearchApplication $searchapp -SiteCollection $siteID if($crawledproperty -eq $null) { Write-Host AppendLog "Creating Crawled Property for $managedPropertyName" Yellow $crawledproperty = New-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $searchapp -VariantType $variantType -SiteCollection $siteID -Category $cat -PropSet "00130329-0000-0130-c000-000000131346" -Name $crawledName -IsNameEnum $false } $managedproperty = Get-SPEnterpriseSearchMetadataManagedProperty -Identity $managedPropertyName -SearchApplication $searchapp -SiteCollection $siteID -ErrorAction SilentlyContinue if($managedproperty -eq $null) { Write-Host AppendLog "Creating Managed Property for $managedPropertyName" Yellow $managedproperty = New-SPEnterpriseSearchMetadataManagedProperty -Name $managedPropertyName -Type $managedPropertyType -SiteCollection $siteID -SearchApplication $searchapp -Queryable:$true -Retrievable:$true -FullTextQueriable:$true -RemoveDuplicates:$false -RespectPriority:$true -IncludeInMd5:$true } $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ?{$_.Name -eq $managedProperty.Name } if($mappedProperty -eq $null) { Write-Host AppendLog "Creating Crawled -> Managed Property mapping for $managedPropertyName" Yellow New-SPEnterpriseSearchMetadataMapping -CrawledProperty $crawledproperty -ManagedProperty $managedproperty -SearchApplication 
$searchapp -SiteCollection $siteID } $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ?{$_.Name -eq $managedProperty.Name } #Get-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $managedproperty } $searchapp = Get-SPEnterpriseSearchServiceApplication $searchAppName $site= Get-SPSite $siteUrl $siteId=$site.id Write-Host "Start creating Managed properties" $i = 1 FOREACH ($property in $ManagedPropertiesList) { $propertyName=$property.managedPropertyName $crawledName=$property.crawledName $managedPropertyType=$property.managedPropertyType $variantType=$property.variantType Write-Host $managedPropertyType Write-Host "Processing managed property $propertyName $($i)..." $i++ CreateRefiner $crawledName $propertyName $variantType $managedPropertyType $siteId Write-Host "Managed property created " $propertyName } Key Concepts Crawled Properties: Crawled properties are discovered by the search index service component when crawling content. Managed Properties: Properties that are part of the Search user experience, which means they are available for search results, advanced search, and so on, are managed properties. Mapping Crawled Properties to Managed Properties: To make a crawled property available for the Search experience—to make it available for Search queries and display it in Advanced Search and search results—you must map it to a managed property. References Administer search in SharePoint 2013 Preview Managing Metadata New-SPEnterpriseSearchMetadataCrawledProperty New-SPEnterpriseSearchMetadataManagedProperty Remove-SPEnterpriseSearchMetadataManagedProperty Overview of crawled and managed properties in SharePoint 2013 Preview Remove-SPEnterpriseSearchMetadataManagedProperty SharePoint 2013 – Search Service Application
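    A quick usage sketch (the script file name and the CSV rows are invented; only the four column names come from the properties the script reads):
        # ManagedProperties.csv - hypothetical contents; the headers must match what the
        # script reads: crawledName, managedPropertyName, managedPropertyType, variantType
        #
        #   crawledName,managedPropertyName,managedPropertyType,variantType
        #   ows_Department,DepartmentProperty,1,31
        #   ows_StartDate,StartDateProperty,4,64
        #
        # Run the script (file name assumed) against the target site collection:
        .\CreateManagedProperties.ps1 -siteUrl "http://SP2013/" -searchAppName "Search Service Application"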

    Read the article

  • SharePoint 2010 Hosting :: Error – HTTP Error 401.1 when Accessing Your SharePoint 2010 Site

    - by mbridge
    When attempting to view a MOSS (SharePoint) 2007 or SharePoint 2010 site locally from a Web Front End (WFE) you get an error stating: “HTTP Error 401.1 – Unauthorized: Access is denied due to invalid credentials.” I have noticed that this happens on Windows Server 2003/2008 SP1/SP2/R2 when using Host Headers and Alternate Access Mappings on a web application in MOSS 2007. If you can access the site from remote machines but cannot access it from the server itself, then this is likely your issue. (If you cannot access the web site either locally or remotely from other machines, then there is a security problem with the site itself and/or possibly a Kerberos-related issue instead.) The fix is in the Microsoft KB article below. I implemented fix #2 on all servers in the MOSS 2007 farm (WFEs and the Indexing/Search server), and for all my newer farm installs, both SharePoint 2007 (MOSS) and SharePoint 2010, I use method 2 on all SharePoint and SQL Servers in the farm. If you use method 1 instead, add all Host Headers and Alternate Access Mappings for all web applications to the BackConnectionHostNames value; you will then be able to access the sites locally from the WFEs. Microsoft KB Link: http://support.microsoft.com/kb/896861 Method 1: Specify Host Names. Please follow these steps: 1. Click Start, click Run, type regedit, and then click OK. 2. In Registry Editor, locate and then click the following registry key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0 3. Right-click MSV1_0, point to New, and then click Multi-String Value. 4. Type BackConnectionHostNames, and then press ENTER. 5. Right-click BackConnectionHostNames, and then click Modify. 6. In the Value data box, type the host name or the host names for the sites that are on the local computer, and then click OK. 7. Quit Registry Editor, and then restart the IISAdmin service. Method 2: Disable the Loopback Check. Please follow these steps: 1. Click Start, click Run, type regedit, and then click OK. 2. In Registry Editor, locate and then click the following registry key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa 3. Right-click Lsa, point to New, and then click DWORD Value. 4. Type DisableLoopbackCheck, and then press ENTER. 5. Right-click DisableLoopbackCheck, and then click Modify. 6. In the Value data box, type 1, and then click OK. 7. Quit Registry Editor, and then restart your computer. Give it a try and good luck.
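    If you would rather script the registry change than click through Registry Editor, a minimal PowerShell sketch of both methods (run elevated on each server; the host names are placeholders):
        # Method 1: list the host headers explicitly, then restart the IISAdmin service
        New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" `
            -Name "BackConnectionHostNames" -PropertyType MultiString `
            -Value @("go.domain.com", "internal.domain.com") -Force
        Restart-Service IISADMIN

        # Method 2: disable the loopback check entirely, then restart the server
        New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" `
            -Name "DisableLoopbackCheck" -PropertyType DWord -Value 1 -Force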

    Read the article

  • Redirect all access requests to a domain and subdomain(s) except from specific IP address? [closed]

    - by Christopher
    This is a self-answered question... After much wrangling I found the magic combination of mod_rewrite rules so I'm posting here. My scenario is that I have two domains - domain1.com and domain2.com - both of which are currently serving identical content (by way of a global 301 redirect from domain1 to domain2). Domain1 was then chosen to be repurposed to be a 'portal' domain - with a corporate CMS-based site leading off from the front page, and the existing 'retail' domain (domain2) left to serve the main web site. In addition, a staging subdomain was created on domain1 in order to prepare the new corporate site without impinging on the root domain's existing operation. I contemplated just rewriting all requests to domain2 and setting up the new corporate site 'behind the scenes' without using a staging domain, but I usually use subdomains when setting up new sites. Finally, I required access to the 'actual' contents of the domains and subdomains - i.e., to not be redirected like all other visitors - in order that I can develop the new site and test it in the staging environment on the live server, as I'm not using a separate development webserver in this case. I also have another test subdomain on domain1 which needed to be preserved. The way I eventually set it up was as follows: (10.2.2.1 would be my home WAN IP) .htaccess in root of domain1 RewriteEngine On RewriteCond %{REMOTE_ADDR} !^10\.2\.2\.1 RewriteCond %{HTTP_HOST} !^staging.domain1.com$ [NC] RewriteCond %{HTTP_HOST} !^staging2.domain1.com$ [NC] RewriteRule ^(.*)$ http://domain2.com/$1 [R=301] .htaccess in staging subdomain on domain1: RewriteEngine On RewriteCond %{REMOTE_ADDR} !^10\.2\.2\.1 RewriteCond %{HTTP_HOST} ^staging.revolver.coop$ [NC] RewriteRule ^(.*)$ http://domain2.com/$1 [R=301,L] The multiple .htaccess files and multiple rulesets require more processing overhead and longer iteration as the visitor is potentially redirected twice, however I find it to be a more granular method of control as I can selectively allow more than one IP address access to individual staging subdomain(s) without automatically granting them access to everything else. It also keeps the rulesets fairly simple and easy to read. (or re-interpret, because I'm always forgetting how I put rules together!) If anybody can suggest a more efficient way of merging all these rules and conditions into just one main ruleset in the root of domain1, please post! I'm always keen to learn, this post is more my attempt to preserve this information for those who are looking to redirect entire domains for all visitors except themselves (for design/testing purposes) and not just denying specific file access for maintenance mode (there are many good examples of simple mod_rewrite rules for 'maintenance mode' style operation easily findable via Google). You can also extend the IP address detection - firstly by using wildcards ^10\.2\.2\..*: the last octet's \..* denotes the usual "." and then "zero or more arbitrary characters", signified by the .* - so you can specify specific ranges of IPs in a subnet or entire subnets if you wish. You can also use square brackets: ^10\.2\.[1-255]\.[120-140]; ^10\.2\.[1-9]?[0-9]\.; ^10\.2\.1[0-1][0-9]\. etc. The third way, if you wish to specify multiple discrete IP addresses, is to bracket them in the style of ^(1.1.1.1|2.2.2.2|3.3.3.3)$, and you can of course use square brackets to substitute octets or single digits again. 
NB: if you're using individual RewriteCond lines to specify multiple IPs / ranges, make sure to put [OR] at the end of each one otherwise mod_rewrite will interpret as "if IP address matches 1.1.1.1 AND if IP address matches 2.2.2.2... which is of course impossible! However as far as I'm aware this isn't necessary if you're using the ! negator to specify "and is not...". Kudos also to SE: this older question also came in useful when I was verifying my own knowledge prior to my futzing around with code. This page was helpful, as were the various other links posted below (can't hyperlink them all due to spam protection... other regex checkers are available). The AddedBytes cheat sheet's useful to pin up on your wall. Other referenced URLs: internetofficer.com/seo-tool/regex-tester/ fantomaster.com/faarticles/rewritingurls.txt internetofficer.com/seo-tool/regex-tester/ addedbytes.com/cheat-sheets/mod_rewrite-cheat-sheet/
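    For what it is worth, an untested sketch of the single root ruleset asked about above. It assumes both staging subdomains share domain1's document root (if they have separate docroots, a merged version would have to live in the vhost config instead): non-trusted visitors are sent to domain2 whether they hit the root host or the staging host, while staging2 stays reachable by everyone.
        RewriteEngine On
        # let the trusted IP through to everything
        RewriteCond %{REMOTE_ADDR} !^10\.2\.2\.1$
        # keep the second (public) test subdomain out of the redirect
        RewriteCond %{HTTP_HOST} !^staging2\.domain1\.com$ [NC]
        RewriteRule ^(.*)$ http://domain2.com/$1 [R=301,L]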

    Read the article

  • openvpn WARNING: No server certificate verification method has been enabled

    - by tmedtcom
    I tried to install OpenVPN on Debian Squeeze (server) and connect from my Fedora 17 machine (client). Here is my configuration: server configuration ###cat server.conf # TCP server proto tcp port 1194 dev tun # keys and certificates ca /etc/openvpn/easy-rsa/keys/ca.crt cert /etc/openvpn/easy-rsa/keys/server.crt key /etc/openvpn/easy-rsa/keys/server.key dh /etc/openvpn/easy-rsa/keys/dh1024.pem # network # virtual address of the VPN network server 192.170.70.0 255.255.255.0 # this line pushes to the client the route to the network behind the server push "route 192.168.1.0 255.255.255.0" # create a route from the server to the tun interface #route 192.170.70.0 255.255.255.0 # security keepalive 10 120 # data encryption cipher cipher AES-128-CBC # enable compression comp-lzo # maximum number of allowed clients max-clients 10 # no dedicated user or group for the VPN user nobody group nogroup # make the connection persistent persist-key persist-tun # OpenVPN status log status /var/log/openvpn-status.log # openvpn logs log /var/log/openvpn.log log-append /var/log/openvpn.log # verbosity level verb 5 ###cat client.conf # Client client dev tun proto tcp-client remote <my server wan IP> 1194 resolv-retry infinite cipher AES-128-CBC # keys ca ca.crt cert client.crt key client.key # security nobind persist-key persist-tun comp-lzo verb 3 Message from the client host (Fedora 17) in the log file /var/log/messages: Dec 6 21:56:00 GlobalTIC NetworkManager[691]: <info> Starting VPN service 'openvpn'... Dec 6 21:56:00 GlobalTIC NetworkManager[691]: <info> VPN service 'openvpn' started (org.freedesktop.NetworkManager.openvpn), PID 7470 Dec 6 21:56:00 GlobalTIC NetworkManager[691]: <info> VPN service 'openvpn' appeared; activating connections Dec 6 21:56:00 GlobalTIC NetworkManager[691]: <info> VPN plugin state changed: starting (3) Dec 6 21:56:01 GlobalTIC NetworkManager[691]: <info> VPN connection 'Connexion VPN 1' (Connect) reply received. Dec 6 21:56:01 GlobalTIC nm-openvpn[7472]: OpenVPN 2.2.2 x86_64-redhat-linux-gnu [SSL] [LZO2] [EPOLL] [PKCS11] [eurephia] built on Sep 5 2012 Dec 6 21:56:01 GlobalTIC nm-openvpn[7472]: WARNING: No server certificate verification method has been enabled. See http://openvpn.net/howto.html#mitm for more info.
    Dec 6 21:56:01 GlobalTIC nm-openvpn[7472]: NOTE: the current --script-security setting may allow this configuration to call user-defined scripts Dec 6 21:56:01 GlobalTIC nm-openvpn[7472]: WARNING: file '/home/login/client/client.key' is group or others accessible Dec 6 21:56:01 GlobalTIC nm-openvpn[7472]: UDPv4 link local: [undef] Dec 6 21:56:01 GlobalTIC nm-openvpn[7472]: UDPv4 link remote: <my server wan IP>:1194 Dec 6 21:56:01 GlobalTIC nm-openvpn[7472]: read UDPv4 [ECONNREFUSED]: Connection refused (code=111) Dec 6 21:56:03 GlobalTIC nm-openvpn[7472]: read UDPv4 [ECONNREFUSED]: Connection refused (code=111) Dec 6 21:56:07 GlobalTIC nm-openvpn[7472]: read UDPv4 [ECONNREFUSED]: Connection refused (code=111) Dec 6 21:56:15 GlobalTIC nm-openvpn[7472]: read UDPv4 [ECONNREFUSED]: Connection refused (code=111) Dec 6 21:56:31 GlobalTIC nm-openvpn[7472]: read UDPv4 [ECONNREFUSED]: Connection refused (code=111) Dec 6 21:56:41 GlobalTIC NetworkManager[691]: <warn> VPN connection 'Connexion VPN 1' (IP Conf ifconfig on the server host (Debian): ifconfig eth0 Link encap:Ethernet HWaddr 08:00:27:16:21:ac inet addr:192.168.1.6 Bcast:192.168.1.255 Mask:255.255.255.0 inet6 addr: fe80::a00:27ff:fe16:21ac/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:9059 errors:0 dropped:0 overruns:0 frame:0 TX packets:5660 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:919427 (897.8 KiB) TX bytes:1273891 (1.2 MiB) tun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 inet addr:192.170.70.1 P-t-P:192.170.70.2 Mask:255.255.255.255 UP POINTOPOINT RUNNING NOARP MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:0 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:100 RX bytes:0 (0.0 B) TX bytes:0 (0.0 B) ifconfig on the client host (Fedora 17) as0t0: flags=4305<UP,POINTOPOINT,RUNNING,NOARP,MULTICAST> mtu 1500 inet 5.5.0.1 netmask 255.255.252.0 destination 5.5.0.1 unspec 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 txqueuelen 200 (UNSPEC) RX packets 0 bytes 0 (0.0 B) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 2 bytes 321 (321.0 B) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0 as0t1: flags=4305<UP,POINTOPOINT,RUNNING,NOARP,MULTICAST> mtu 1500 inet 5.5.4.1 netmask 255.255.252.0 destination 5.5.4.1 unspec 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 txqueuelen 200 (UNSPEC) RX packets 0 bytes 0 (0.0 B) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 2 bytes 321 (321.0 B) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0 as0t2: flags=4305<UP,POINTOPOINT,RUNNING,NOARP,MULTICAST> mtu 1500 inet 5.5.8.1 netmask 255.255.252.0 destination 5.5.8.1 unspec 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 txqueuelen 200 (UNSPEC) RX packets 0 bytes 0 (0.0 B) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 2 bytes 321 (321.0 B) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0 as0t3: flags=4305<UP,POINTOPOINT,RUNNING,NOARP,MULTICAST> mtu 1500 inet 5.5.12.1 netmask 255.255.252.0 destination 5.5.12.1 unspec 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 txqueuelen 200 (UNSPEC) RX packets 0 bytes 0 (0.0 B) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 2 bytes 321 (321.0 B) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0 p255p1: flags=4163<UP,BROADCAST,RUNNING,MULTICAST> mtu 1500 inet 192.168.1.2 netmask 255.255.255.0 broadcast 192.168.1.255 inet6
fe80::21d:baff:fe20:b7e6 prefixlen 64 scopeid 0x20<link> ether 00:1d:ba:20:b7:e6 txqueuelen 1000 (Ethernet) RX packets 4842070 bytes 3579798184 (3.3 GiB) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 3996158 bytes 2436442882 (2.2 GiB) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0 device interrupt 16 p255p1 is label for eth0 interface and on the server : root@hoteserver:/etc/openvpn# tree . +-- client ¦** +-- ca.crt ¦** +-- client.conf ¦** +-- client.crt ¦** +-- client.csr ¦** +-- client.key ¦** +-- client.ovpn ¦* ¦** +-- easy-rsa ¦** +-- build-ca ¦** +-- build-dh ¦** +-- build-inter ¦** +-- build-key ¦** +-- build-key-pass ¦** +-- build-key-pkcs12 ¦** +-- build-key-server ¦** +-- build-req ¦** +-- build-req-pass ¦** +-- clean-all ¦** +-- inherit-inter ¦** +-- keys ¦** ¦** +-- 01.pem ¦** ¦** +-- 02.pem ¦** ¦** +-- ca.crt ¦** ¦** +-- ca.key ¦** ¦** +-- client.crt ¦** ¦** +-- client.csr ¦** ¦** +-- client.key ¦** ¦** +-- dh1024.pem ¦** ¦** +-- index.txt ¦** ¦** +-- index.txt.attr ¦** ¦** +-- index.txt.attr.old ¦** ¦** +-- index.txt.old ¦** ¦** +-- serial ¦** ¦** +-- serial.old ¦** ¦** +-- server.crt ¦** ¦** +-- server.csr ¦** ¦** +-- server.key ¦** +-- list-crl ¦** +-- Makefile ¦** +-- openssl-0.9.6.cnf.gz ¦** +-- openssl.cnf ¦** +-- pkitool ¦** +-- README.gz ¦** +-- revoke-full ¦** +-- sign-req ¦** +-- vars ¦** +-- whichopensslcnf +-- openvpn.log +-- openvpn-status.log +-- server.conf +-- update-resolv-conf on the client: [login@hoteclient openvpn]$ tree . |-- easy-rsa | |-- 1.0 | | |-- build-ca | | |-- build-dh | | |-- build-inter | | |-- build-key | | |-- build-key-pass | | |-- build-key-pkcs12 | | |-- build-key-server | | |-- build-req | | |-- build-req-pass | | |-- clean-all | | |-- list-crl | | |-- make-crl | | |-- openssl.cnf | | |-- README | | |-- revoke-crt | | |-- revoke-full | | |-- sign-req | | `-- vars | `-- 2.0 | |-- build-ca | |-- build-dh | |-- build-inter | |-- build-key | |-- build-key-pass | |-- build-key-pkcs12 | |-- build-key-server | |-- build-req | |-- build-req-pass | |-- clean-all | |-- inherit-inter | |-- keys [error opening dir] | |-- list-crl | |-- Makefile | |-- openssl-0.9.6.cnf | |-- openssl-0.9.8.cnf | |-- openssl-1.0.0.cnf | |-- pkitool | |-- README | |-- revoke-full | |-- sign-req | |-- vars | `-- whichopensslcnf |-- keys -> ./easy-rsa/2.0/keys/ `-- server.conf the problem source is cipher AES-128-CBC ,proto tcp-client or UDP or the interface p255p1 on fedora17 or file authentification ta.key is not found ????
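    A hedged reading of the log (not part of the original post): the warning itself is about the client configuration, and the repeated "Connection refused" lines show NetworkManager dialing UDP while the server only listens on TCP 1194, so the protocol has to match on both ends as well. Possible client.conf additions:
        proto tcp-client
        remote <my server wan IP> 1194
        # tell the client to verify that the peer certificate was issued for server use;
        # this silences "No server certificate verification method has been enabled"
        remote-cert-tls server
        # older alternative: ns-cert-type server (only valid if the server cert was built
        # with build-key-server so the nsCertType attribute is set)
    Running chmod 600 on /home/login/client/client.key should also clear the "group or others accessible" warning.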

    Read the article

  • Migrate ASP.Net web site from IIS6 to IIS7

    - by David.Chu.ca
    I have to migrate an ASP.Net web site from IIS6 to IIS7. I tried to copy all the files for the web site from IIS6 (c:\inetpub\wwwroot\MySite) to another box with Windows Server 2008 R2, where IIS7 is the default web server. However, a simple copy does not seem to work. Should I rebuild the web site for IIS7, or should I make changes on the new box with IIS7, such as in web.config? Thanks for the comments. On further investigation I found that the http handlers section seems to have caused the exception: <!--httpHandlers> <add path="Reserved.ReportViewerWebControl.axd" verb="*" type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" validate="false"/> </httpHandlers--> After I comment out the above handler in web.config, the web page works fine. This is just my initial test. I am not sure if I should rebuild the web site from source code or not. If so, do I need to target IIS7 specifically?
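    For reference, a hedged sketch of what usually has to change for IIS7's integrated pipeline (not taken from the question): the ReportViewer handler is declared under system.webServer/handlers rather than system.web/httpHandlers, and the assembly version should match whatever ReportViewer build is installed on the new box.
        <system.webServer>
          <validation validateIntegratedModeConfiguration="false" />
          <handlers>
            <!-- the name attribute is invented; path/type are copied from the httpHandlers entry above -->
            <add name="ReportViewerWebControl" verb="*" path="Reserved.ReportViewerWebControl.axd"
                 type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
          </handlers>
        </system.webServer>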

    Read the article

  • Verification of files on backup media - after the backup

    - by Greg Sansom
    I have a system in place where I back up my work files daily to a portable hard drive. I actually have two portable hard drives - one is stored off-site and I swap them regularly. I also keep my family photos and other historical files backed up, but I only back the photos up occasionally (ie when I have new photos). The backup media is for backup only, and it is unlikely I will ever read the files from the backup media unless a disaster occurs and I lose the master. It worries me that my backed up files could become corrupt without me knowing it. It is also feasible that my master files could become corrupt, and eventually the corrupt files would be replicated to the backup media. I'm currently using Cobian Backup, but I'm open to alternatives. Is there a tool I can use to confirm that the backed up files are identical to the files that were first copied? I know it would be possible to generate a checksum and periodically validate the backup files against the original checksum, but I'm looking for a tool which will do this automatically.
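    To help frame the options, a minimal PowerShell sketch of the checksum idea mentioned above (requires PowerShell 4.0 or later for Get-FileHash; the paths are placeholders). Note that it compares master and backup file by file, so it still will not catch the master itself silently corrupting after the last good backup; for that you would need to validate against checksums stored at backup time.
        $master = "D:\Work"
        $backup = "F:\Backup\Work"

        Get-ChildItem $master -Recurse -File | ForEach-Object {
            $relative = $_.FullName.Substring($master.Length).TrimStart('\')
            $copy     = Join-Path $backup $relative
            if (-not (Test-Path $copy)) {
                Write-Warning "Missing on backup: $relative"
            }
            elseif ((Get-FileHash $_.FullName).Hash -ne (Get-FileHash $copy).Hash) {
                Write-Warning "Hash mismatch: $relative"
            }
        }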

    Read the article

  • Changing IP address in IIS for SharePoint site results in Directory listing error

    - by Dan
    I have a server here that has 2 roles. One is Exchange 2007 and the other is MOSS 2007. In IIS i have a site, go.domain.com which has our OWA. The other is internal.domain.com which is the MOSS site. I have given the NIC local IPs and each site is using host headers. The GO site has an SSL cert from NetSol, and the MOSS site has a self signed. Right now going to either shows the NetSol site, which browsers complain about when going to the internal.domain.com site, obviously, since they are on the same IP in IIS. Both sites have always run off the original IP of 10.0.0.3 in IIS. When i added the second IP to the NIC, (10.0.0.6) and changed the Sharepoint site in IIS to use this for http and https access, I now get this message in a browser when trying to connect. Directory Listing Denied This Virtual Directory does not allow contents to be listed. Changing the IP back to 10.0.0.3 and the internal site is back up. What am I missing here? Do i need to fool around with Alternate Access Mappings in Central Admin? Am i completely missing the point with multiple SSL certs and host headers?

    Read the article

  • How to publish an Access DB to an https SharePoint 2010 site with a self-signed certificate

    - by ybbest
    Here is what to do if you are running into trouble (shown below) when you publish your Access database to an https SharePoint 2010 site with a self-signed certificate. Problem: First you get a warning (see the screenshot below), and then you get the error message. Solution: The error “The name of the security certificate is invalid or does not match the name of the site” occurs when the ‘common name’ in the certificate doesn’t match the address you typed into the browser to access the site. To fix the problem, you need to use a script to generate the certificate rather than using the IIS UI, because the UI defaults the common name to the server name, and you will hit the above problem when using that certificate on a web application with a different host name. You can use SelfSSL.exe (IIS 6.0 only); you have to specify the common name (cn), for example: selfssl.exe /T /N:cn=testsharepoint.com /K:1024 /V:7 /S:1 /P:443 Or you can use makecert (IIS 7.0 and above): makecert -r -pe -n 'CN=my.domain.here' -b 01/01/2000 -e 01/01/2036 -eku 1.3.6.1.5.5.7.3.1 -ss my -sr localMachine -sky exchange -sp "Microsoft RSA SChannel Cryptographic Provider" -sy 12 After you have created the certificate, you then need to add that self-signed certificate to your IIS web site and to the Trusted Root Certification Authorities. (To get there, press Windows + R, type mmc.exe, and add the Certificates console.) I compiled this solution from the questions I asked on SharePoint Stack Exchange.
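    If you prefer the command line to the MMC console for the trust step, a small sketch (the .cer path is hypothetical; certutil ships with Windows):
        certutil -addstore -f Root "C:\certs\testsharepoint.cer"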

    Read the article

  • Google Analytics Visitors drop-off for certain region of site only

    - by crmpicco
    I have an issue with the tracking on my site where I have seen a dramatic drop off of visitors to the site from a certain region. I have four regions on my site at the moment, these are UK, EU, US and RoW (Rest of the World). The UK, EU and US regions are unaffected, only the RoW region suffers this drop-off. I have included a screen shot below from my GA account, which shows this effect. My GA code, which is included on every page on the site is below. I have changed the UA account number intentionally for this example. There have been no changes made to the GA account or the tracking code in a live environment for some considerable time, but for some reason I am seeing the drop-off for this region only. In the code below I am not tracking page views on certain pages as I have event tracking setup for these pages. <script type="text/javascript"> var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-18721873-5']); _gaq.push(['_setCookiePath', '/row/']); if ( typeof(p_page) != 'undefined') { // do nothing if user is on above pages // N.B. there are a series of conditions in this if statement checking that we are not on a particular page } else { _gaq.push(['_trackPageview']); } </script>

    Read the article

  • WebMatrix “The Site has Stopped” Fix

    - by Tarun Arora
    I just got started with AzureWebSites by creating a website by choosing the Wordpress template. Next I tried to install WebMatrix so that I could run the website locally. Every time I tried to run my website from WebMatrix I hit the message “The following site has stopped ‘xxx’” Step 00 – Analysis It took a bit of time to figure out that WebMatrix makes use of IISExpress. But it was easy to figure out that IISExpress was not showing up in the system tray when I started WebMatrix. This was a good indication that IISExpress is having some trouble starting up. So, I opened CMD prompt and tried to run IISExpress.exe this resulted in the below error message So, I ran IISExpress.exe /trace:Error this gave more detailed reason for failure Step 1 – Fixing “The following site has stopped ‘xxx’” Further analysis revealed that the IIS Express config file had been corrupted. So, I navigated to C:\Users\<UserName>\Documents\IISExpress\config and deleted the files applicationhost.config, aspnet.config and redirection.config (please take a backup of these files before deleting them). Come back to CMD and run IISExpress /trace:Error IIS Express successfully started and parked itself in the system tray icon. I opened up WebMatrix and clicked Run, this time the default site successfully loaded up in the browser without any failures. Step 2 – Download WordPress Azure WebSite using WebMatrix Because the config files ‘applicationhost.config’, ‘aspnet.config’ and ‘redirection.config’ were deleted I lost the settings of my Azure based WordPress site that I had downloaded to run from WebMatrix. This was simple to sort out… Open up WebMatrix and go to the Remote tab, click on Download Export the PublishSettings file from Azure Management Portal and upload it on the pop up you get when you had clicked Download in the previous step Now you should have your Azure WordPress website all set up & running from WebMatrix. Enjoy!
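    A condensed sketch of Step 1 as a script, in case anyone wants to automate it (paths assume a default IIS Express install, so adjust for 64-bit systems; Move-Item keeps the old files as a backup instead of deleting them outright, and whether IIS Express comes back up cleanly afterwards is an assumption based on the behaviour described above):
        $config = Join-Path $env:USERPROFILE "Documents\IISExpress\config"

        # park the (possibly corrupted) config files somewhere safe
        "applicationhost.config", "aspnet.config", "redirection.config" | ForEach-Object {
            Move-Item -Path (Join-Path $config $_) -Destination "$env:TEMP\$_.bak" -Force -ErrorAction SilentlyContinue
        }

        # start IIS Express with error tracing; it should come back up with a fresh default config
        & "$env:ProgramFiles\IIS Express\iisexpress.exe" /trace:error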

    Read the article

  • How to add files to a document library in a site definition in SharePoint 2007?

    - by jaloplo
    Hi all, I'm doing a site definition for SharePoint 2007. When the site is created, a document library called "Folder2" is created also. Now, I need to add some documents to this document library and appear as items in the document library standard views. My code is: <Lists> <List FeatureId="00bfea71-e717-4e80-aa17-d0c71b360101" Type="101" Title="Folder2" Url="Folder2"> <Data> <Rows> <Row> <Field Name="Name">MyFile.txt</Field> <Field Name="Title">MyFile.txt</Field> <Field Name="FileLeafRef">MyFile.txt</Field> </Row> </Rows> </Data> </List> </Lists> When I see the items of the Document Library there is one element with title "1_". Does anybody know how to add files in a site definition? The onet.xml I used is the same as blank site. Thanks!!!
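    For comparison, file provisioning in a site definition is more commonly done with a Module element than with List/Data rows; a hedged sketch (the Module name is invented, and the Module also has to be referenced from the relevant Configuration in onet.xml):
        <Modules>
          <Module Name="Folder2Files" Url="Folder2" Path="">
            <File Url="MyFile.txt" Type="GhostableInLibrary">
              <Property Name="Title" Value="MyFile.txt" />
            </File>
          </Module>
        </Modules>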

    Read the article

  • Site Action Button Missing

    - by David
    I have a MOSS 2007 site collection which was running smoothly. However, after moving to Forms Authentication connected to Active Directory, neither my Farm Administrator account nor the 2 site collection administrators can get past the site homepage. There is no Site Actions button available. When I log into Central Administration with the Farm Administrator account the Site Actions button does exist there, so I suspect it's something to do with the permissions on the site application. I have tried to connect directly to the settings URL http://mysite/_layouts/settings.aspx etc. but get the usual error: Error: Access Denied Current User You are currently signed in as: xxxx (Farm Administrator Account) Sign in as a different user Request access I get the same error when logging in with the site collection admin accounts. I've also checked to make sure that the sites are not locked out.

    Read the article

  • Home page of my site does not appear in Google search when I type site:mario--games.com

    - by Bimal Das
    I am totally confused: when I search for site:mario--games.com in Google I see every page of my website in the SERP (search engine results page) except the home page. I don’t know why this is happening; I have checked everything against Google’s SEO guidelines but without success. I have also checked my site through Google Webmaster Tools and everything looks fine there. Please help me solve this. The htaccess rules of my site: RewriteEngine on #RewriteBase / #RewriteCond %{HTTP_REFERER} !^$ #RewriteCond %{HTTP_REFERER} !^http://(www\.)?jocuri-barbie.ro/.*$ [NC] #RewriteCond %{HTTP_REFERER} !^http://(www\.)?joc-jocuri-barbie.ro/.*$ [NC] #RewriteRule \.(gif|jpg|swf|flv|png)$ / [F] RewriteCond %{HTTP_HOST} ^sitename\.com$ RewriteRule ^(.*)$ http://www.sitename.com/$1 [R=301,L] RewriteRule ^privacy_policy.html$ privacy.php RewriteRule ^contact_us.html$ contact.php #RewriteCond %{HTTP_HOST} !^sitename.com$ #RewriteRule ^(.*)$ http://sitename.com/$1 [L,QSA,R=301] RewriteRule ^index\.php$ / [QSA,R=301] RewriteRule ^([A-Za-z0-9_]+)_([0-9]+).html$ articol.php?id=$2 RewriteRule ^([a-z_]+).html$ categorii.php?abbr=$1 RewriteRule ^([A-Za-z0-9+]+)$ search.php?cuvinte=$1 RewriteRule ^images/$ images [F] RewriteRule ^o_articole/$ o_articole [F]

    Read the article

  • How to Search Just the Site You’re Viewing Using Google Search

    - by The Geek
    Have you ever wanted to search the site you’re viewing, but the built-in search box is either hard to find, or doesn’t work very well? Here’s how to add a special keyword bookmark that searches the site you’re viewing using Google’s site: search operator. This technique should work in either Google Chrome or Firefox—in Firefox you’ll want to create a regular bookmark and add the script into the keyword field, and for Google Chrome just follow the steps we’ve provided below.

    Read the article

  • How Google handles site traffic in Google Analytics

    - by Hamidreza
    I have a site at www.exam.com and I have put the Google Analytics JavaScript snippet in it. I have made an app for my site, and I want the app to visit the site every time a user uses it, through a built-in browser inside the application (I am using C# for the application with the .NET web browser control). The app will open www.example.com/appvisit, and I have put the Google Analytics script on that page and nothing else. I also want to disallow the /appvisit address in my robots.txt file. I want to know: is there any problem with doing this? Will Google crawl the /appvisit directory? Does Google frown on this kind of thing? And will Google treat this traffic as genuine, normal traffic? Thanks.
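    For the robots.txt part, a minimal sketch of the rule described (placed at the site root; note that Disallow only stops crawling, it is not a guarantee the URL never appears in results):
        User-agent: *
        Disallow: /appvisit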

    Read the article

  • Home page of my site does not appear in Google search when I type site:example.com

    - by Bimal Das
    I am totally confused: when I search for site:example.com in Google I see every page of my website in the SERP (search engine results page) except the home page. I don’t know why this is happening; I have checked everything against Google’s SEO guidelines but without success. I have also checked my site through Google Webmaster Tools and everything looks fine there. Please help me solve this. The htaccess rules of my site: RewriteEngine on #RewriteBase / #RewriteCond %{HTTP_REFERER} !^$ #RewriteCond %{HTTP_REFERER} !^http://(www\.)?example.ro/.*$ [NC] #RewriteCond %{HTTP_REFERER} !^http://(www\.)?exa-example.ro/.*$ [NC] #RewriteRule \.(gif|jpg|swf|flv|png)$ / [F] RewriteCond %{HTTP_HOST} ^sitename\.com$ RewriteRule ^(.*)$ http://www.sitename.com/$1 [R=301,L] RewriteRule ^privacy_policy.html$ privacy.php RewriteRule ^contact_us.html$ contact.php #RewriteCond %{HTTP_HOST} !^sitename.com$ #RewriteRule ^(.*)$ http://sitename.com/$1 [L,QSA,R=301] RewriteRule ^index\.php$ / [QSA,R=301] RewriteRule ^([A-Za-z0-9_]+)_([0-9]+).html$ articol.php?id=$2 RewriteRule ^([a-z_]+).html$ categorii.php?abbr=$1 RewriteRule ^([A-Za-z0-9+]+)$ search.php?cuvinte=$1 RewriteRule ^images/$ images [F] RewriteRule ^o_articole/$ o_articole [F]

    Read the article

  • Add second site through iframe

    - by Anna Danson
    I have two blogs on Tumblr. Let's call them Pets.Tumblr.com and Cats.Tumblr.com A while ago I decided to make Pets.tumblr.com my main blog, but since Cats.tumblr.com grew more popular, I decided to merge these sites together. What I have done is this: I've created a blank page on pets.tumblr.com/cats, put a full sized iframe with cats.tumblr.com as source, and a jquery redirect script in cats.tumblr.com that redirects to pets.tumblr.com/cats I'm wondering if this would impact my site negatively? Will search engines see pets.tumblr.com/cats as a blank site (iframes are ignored?) and cats.tumblr.com as a spam site because it redirects to a blank one?

    Read the article

  • Tracking multiple subdomains and domains going to the same site, separately in Google Analytics

    - by miles
    I have a new site that has multiple top-level domains and subdomains all going to it: www.domain.com, campaign.domain.com, chicago.domain.com, domain2.com - all go to the same site/site directory. Right now I have one Google Analytics account profile set up for it, but I want to be able to track the traffic that is hitting those different URLs separately. The domains are being routed on the server-side (not .htaccess). How can I do this in Google Analytics? Do I need to create filters? Or create different profiles for each domain?

    Read the article

  • Parsing google site speed in analytics

    - by Kevin Burke
    I'm having a hard time making heads or tails of the Site Speed graphs in Google Analytics. Our site speed is fluctuating wildly from month to month, despite a large sample (the report is "based on 100,000's of visits) and a consistent web set up (static files served from an EC2 instance running nginx behind a load balancer). Here's our site speed, with each datapoint representing a week worth of data. Over this time period we modified our source and HTTP headers to increase our cache hits on static resources by 5x. Why would it fluctuate so much? Is there any way to get more reliable information from those graphs?

    Read the article

  • Why is my site not ranking in google.com?

    - by nishant
    I am very tired of my website's ranking in Google. I am working hard on it but my site is not getting anywhere in Google. I am the webmaster of www.panbeli.in, a matrimony website in India. I have been trying to improve its visibility in Google for the last 5 months but am not getting any positive result. Other search engines like Yahoo and Bing give good results, but Google shows no result for my site in the top 20 pages. My website is www.panbeli.in and my keywords are: bari samaj, bari matrimony, bari community, bari shadi, panbeli. Please help me if you can, because I am very frustrated about it. My domain is 4 years old. When I type link:panbeli.in into Google search, no pages from my site appear. What does that mean? Is my site not indexed in Google?

    Read the article

  • Choosing a subject to create a site for

    - by Daniel
    I have been working as a web developer for 6 years and I would like to finally create my own site. I thought I had everything needed: programming and administration skills, a good designer, my own servers, and an understanding of how to operate a site. But I’ve missed a very important (or the most important) point: how do I find a subject for my site? My hobby and my work are the same: IT. But I don’t want to create just another tech blog or news aggregator; I want something different. At first I thought things like Google Trends or the Google top 1000 could help, but I got lost in the thousands of options and can’t review them all (well, I could, but it would take at least a couple of months). So my question is: how did you start? Where did you get your idea?

    Read the article

< Previous Page | 11 12 13 14 15 16 17 18 19 20 21 22  | Next Page >