Search Results

Search found 15415 results on 617 pages for 'security groups'.

Page 95/617

  • Intermittent PolicyException: Execution permission cannot be acquired.

    - by Aaron Maenpaa
    We are intermittently seeing the following exception shortly after an App Pool recycle in an ASP.NET application:

        System.Configuration.ConfigurationErrorsException: Could not load file or assembly 'Microsoft.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. Failed to grant permission to execute. (Exception from HRESULT: 0x80131418)
        ---> System.IO.FileLoadException: Could not load file or assembly 'Microsoft.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. Failed to grant permission to execute. (Exception from HRESULT: 0x80131418)
        File name: 'Microsoft.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'
        ---> System.Security.Policy.PolicyException: Execution permission cannot be acquired.
           at System.Security.SecurityManager.ResolvePolicy(Evidence evidence, PermissionSet reqdPset, PermissionSet optPset, PermissionSet denyPset, PermissionSet& denied, Boolean checkExecutionPermission)
           at System.Security.SecurityManager.ResolvePolicy(Evidence evidence, PermissionSet reqdPset, PermissionSet optPset, PermissionSet denyPset, PermissionSet& denied, Int32& securitySpecialFlags, Boolean checkExecutionPermission)
           at System.Reflection.Assembly._nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, Assembly locationHint, StackCrawlMark& stackMark, Boolean throwOnFileNotFound, Boolean forIntrospection)
           at System.Reflection.Assembly.InternalLoad(AssemblyName assemblyRef, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection)
           at System.Reflection.Assembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection)
           at System.Reflection.Assembly.Load(String assemblyString)
           at System.Web.Configuration.CompilationSection.LoadAssemblyHelper(String assemblyName, Boolean starDirective)

    The specific DLL that fails to load varies from incident to incident, but it is always one referenced by the main assembly. We're running ASP.NET 3.5 on Windows Server 2008. This seems to happen in batches, affecting some but not all of the sites in the same App Pool. We have a large number of sites all running the same code. Once a site has failed to load a DLL it throws up a Yellow Screen of Death until the next App Pool recycle. We haven't been able to reproduce this behavior, and the sites seem to work fine for days or weeks at a time (and many App Pool recycles) before failing. Has anybody else seen similar behavior?

    Update: We've tried reproducing the failure by setting up a few hundred sites and writing a script to hit them repeatedly while recycling the App Pool once every couple of minutes, and were unable to accomplish much other than loading down the server's CPU for a few days straight. We then tried messing with the copies of the DLLs that ASP.NET makes (locking one of the DLLs, changing the file permissions) and managed to reproduce similar behavior, but not the same exception. Does anybody have any ideas on how to adjust the security policy to get it to throw a System.Security.Policy.PolicyException: Execution permission cannot be acquired. when loading a specific DLL?
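    One way to force exactly that exception on a test box (on .NET 2.0/3.5, where CAS machine policy still applies) is to add an exclusive code group that grants the "Nothing" permission set to the specific DLL. A rough sketch with caspol; the path and group name are placeholders, and because ASP.NET loads the shadow copies it makes under Temporary ASP.NET Files, the URL condition may need to point there rather than at the site's bin folder:

        rem deny all permissions (including execution) to one assembly, machine-wide
        caspol -m -ag 1 -url "file://C:/inetpub/wwwroot/MySite/bin/Microsoft.Web.Mvc.dll" Nothing -exclusive on -n "DenyMvcExecution"
        iisreset

        rem list the groups to find the new label, then remove the group again when done
        caspol -m -lg
        caspol -m -rg 1.7

    Treat this purely as a way to reproduce the PolicyException on demand, not as a fix.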

    Read the article

  • Certificate Validation Issues

    - by user298331
    Hi All, I am facing some issues related to certificates and need some help to resolve them.

    Requirements: security mode="TransportWithMessageCredential", binding name="basicHttpEndpointBinding", certificateValidationMode="ChainTrust", revocationMode="Online".

    Certificates:

    Service certificates:
    - Transport level: XXXX.cer. The certificate name is my system's DNS name and it chains up to the root certificate RootTrnCA.cer. This is used to enable HTTPS, but I am not validating transport-level certificates.
    - Message level: services.ca.iim (VXXXX.Cer -- Act.Mac.Ca -- services.ca.iim)

    Client certificates:
    - Transport level: ZZZZ.cer. The certificate name is my system's DNS name and it chains up to the root certificate RootTrnCA.cer. I am ignoring transport certificate errors in code.
    - Message level: client.ca.iim (VXXXX.Cer -- Act.Mac.Ca -- client.ca.iim)

    Issues:
    1) The response message does not contain the service certificate signature in the SOAP header, so I am not able to validate the server certificate details in the client code.
    2) If I use TransportWithMessageCredential and ChainTrust, I get the error: "The revocation function was unable to check revocation because the revocation server was offline."

    So please verify the service and client config below and correct me if I am wrong.

    Service config:

    Client config: I am attaching the certificate in code:

        objProxy.ChannelFactory.Credentials.ClientCertificate.SetCertificate(
            System.Security.Cryptography.X509Certificates.StoreLocation.LocalMachine,
            System.Security.Cryptography.X509Certificates.StoreName.My,
            X509FindType.FindBySubjectName, "client.ca.iim");

        <binding name="XXXXXServiceHost.Http" closeTimeout="00:01:00" openTimeout="00:01:00"
                 receiveTimeout="00:10:00" sendTimeout="00:01:00" allowCookies="false"
                 bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
                 maxBufferSize="65536" maxBufferPoolSize="524288" maxReceivedMessageSize="65536"
                 messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
                 useDefaultWebProxy="true">
          <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                        maxBytesPerRead="4096" maxNameTableCharCount="16384" />
          <security mode="TransportWithMessageCredential">
            <transport clientCredentialType="None" proxyCredentialType="None" realm="" />
            <message clientCredentialType="Certificate" algorithmSuite="Default" />
          </security>
        </binding>
        </basicHttpBinding>
        </bindings>
        <client>
          <endpoint address="https://XXXXXX/XXXServiceHost/MemberSvc.svc/soap11"
                    binding="basicHttpBinding" bindingConfiguration="XXXServiceHost.Http"
                    contract="ServiceReference1.IMemberIBA" name="XXXServiceHost.Http" />
        </client>
        </system.serviceModel>

    Please verify both and help me resolve the above two issues. Thanks, Babu
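    For the second issue, the client-side validation of the service certificate can be tuned without abandoning ChainTrust; a minimal C# sketch (objProxy is the proxy from the question, and the chosen RevocationMode is an illustrative trade-off, not a definitive fix):

        // Sketch only: configure how the client validates the service certificate.
        var auth = objProxy.ChannelFactory.Credentials.ServiceCertificate.Authentication;
        auth.CertificateValidationMode =
            System.ServiceModel.Security.X509CertificateValidationMode.ChainTrust;
        // "Revocation server was offline" comes from the Online revocation check;
        // Offline uses cached CRLs, NoCheck skips revocation entirely (weaker security).
        auth.RevocationMode =
            System.Security.Cryptography.X509Certificates.X509RevocationMode.Offline;

    The same settings can also live in config under the client endpoint behavior (clientCredentials/serviceCertificate/authentication), if setting them in code is inconvenient.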

    Read the article

  • GlassFish Security Realm, Active Directory and Referral

    - by Allan Lykke Christensen
    I've set up a security realm in GlassFish to authenticate against an Active Directory server. The configuration of the realm is as follows:

        Class Name: com.sun.enterprise.security.auth.realm.ldap.LDAPRealm
        JAAS context: ldapRealm
        Directory: ldap://172.16.76.10:389/
        Base DN: dc=smallbusiness,dc=local
        search-filter: (&(objectClass=user)(sAMAccountName=%s))
        group-search-filter: (&(objectClass=group)(member=%d))
        search-bind-dn: cN=Administrator,CN=Users,dc=smallbusiness,dc=local
        search-bind-password: abcd1234!

    The realm is functional and I can log in, but whenever I log in I get the following error in the log:

        SEC1106: Error during LDAP search with filter [(&(objectClass=group)(member=CN=Administrator,CN=Users,dc=smallbusiness,dc=local))].
        SEC1000: Caught exception.
        javax.naming.PartialResultException: Unprocessed Continuation Reference(s); remaining name 'dc=smallbusiness,dc=local'
            at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:2820)
            ....
            ....
        ldaplm.searcherror

    While searching for a solution I found a recommendation to add java.naming.referral=follow to the properties of the realm. However, after I add this it takes 20 minutes for GlassFish to authenticate against Active Directory. I suspect it is a DNS problem on the Active Directory server. The Active Directory server is a vanilla Windows Server 2003 setup in a virtual machine. Any help/recommendation is highly appreciated!
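    A trick often used against Active Directory, instead of chasing referrals, is to point the realm at the Global Catalog port rather than the plain LDAP port: the Global Catalog does not hand out continuation references for the partitions it holds, so both the PartialResultException and the java.naming.referral=follow slowdown are avoided. A sketch of the one changed realm property, assuming this domain controller also hosts the Global Catalog:

        Directory: ldap://172.16.76.10:3268/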

    Read the article

  • Does Security Trimming work with Web Forms Routing?

    - by Slauma
    In my web.config I have configured a SiteMapProvider with securityTrimmingEnabled="true" and on my main master page is an asp:Menu control bound to an asp:SiteMapDataSource. In addition I have configured restricted access to all pages in a subfolder "Admin" (using another web.config in this subfolder). If I put a sitemapNode in Web.sitemap...

        <siteMapNode url="~/Admin/Default.aspx" title="Administration" description="" >

    ...only users in role "Admin" will have the menu item related to that siteMapNode. So this is working fine and as intended. Now I have defined a URL route in Global.asax to map the physical file to a new URL:

        System.Web.Routing.RouteTable.Routes.MapPageRoute("AdminHomeRoute", "Administration/Home", "~/Admin/Default.aspx");

    But when I use this route-URL in the SiteMap file...

        <siteMapNode url="Administration/Home" title="Administration" description="" >

    ...it seems that security trimming does not work: the menu item is visible for all users. (Access to the page is still restricted though, so selecting the menu item by non-Admin users does not navigate to the restricted page.) Question: Is there any setting I've missed so far to make security trimming work with URL routing in ASP.NET 4.0 Web Forms? Did I do something wrong? Is there any work-around? Thank you for your help!
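    As a way to see what the provider itself decides (and as a possible fallback), the trimming check can be performed explicitly against the node for the physical URL; a hedged C# sketch, assuming the default XmlSiteMapProvider and the route shown above:

        // e.g. in the master page: ask the provider whether the current user may see the Admin node.
        SiteMapNode adminNode = SiteMap.Provider.FindSiteMapNode("~/Admin/Default.aspx");
        bool showAdminItem = adminNode != null
            && SiteMap.Provider.IsAccessibleToUser(HttpContext.Current, adminNode);
        // If this is false for non-admin users while the routed node still renders,
        // the trimming lookup is keyed to the physical URL rather than the routed one,
        // and the menu item can be hidden manually based on showAdminItem.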

    Read the article

  • Applying fine-grained security to an existing application

    - by Mark
    I've inherited a reasonably large and complex ASP.NET MVC3 web application using EF Code First on SQL Server. It uses ASP.NET Membership roles with database authentication. The controller actions are secured with attributes derived from AuthorizeAttribute that map roles to actions. There are extension methods for the finer points, such as showing a particular widget to particular roles. This works great and I have a good understanding of the current security model. I've been asked to provide finer-grained security at the data level. For example, a 'Customer' user can only see data (throughout the database) associated with themselves and not other Customers. The problem is that 'Customer' is only 1 of 5 different types with their own specific restrictions (each of the 9 roles is one of these 5 types). The best thing I can think of is to go through all the data repositories and extend each and every LINQ statement/query with a filter for every user type. Even if I had time for that it doesn't seem like the most elegant way. Any suggestions? I really don't know where to start with this so anything could be helpful. Many thanks.
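    Rather than touching every query, one option is to funnel the per-user restriction through a single extension method that all repositories share, so the "which rows may this role see" decision lives in one place. A rough C# sketch with invented names (IOwnedByCustomer, ForUser) purely for illustration:

        using System.Linq;

        // Hypothetical marker interface for entities that belong to a customer.
        public interface IOwnedByCustomer
        {
            int CustomerId { get; }
        }

        public static class SecurityFilters
        {
            // Central data-level filter: null means "may see everything" (e.g. staff roles),
            // otherwise the query is restricted to the caller's own rows.
            public static IQueryable<T> ForUser<T>(this IQueryable<T> source, int? currentCustomerId)
                where T : class, IOwnedByCustomer
            {
                return currentCustomerId == null
                    ? source
                    : source.Where(e => e.CustomerId == currentCustomerId.Value);
            }
        }

    Each of the 5 user types then maps to how currentCustomerId (or a richer filter object) gets populated, instead of to hand-edited LINQ in every repository.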

    Read the article

  • mscomctl.ocx on my dev machine gives me problems since Microsoft's security patch

    - by Bronzato
    I have been busy with this problem for two days and hope someone can get me out of it. I have Excel 2010 (full install, 944 MB) on my Windows 8 computer. It works well. But when I modify my workbook (containing a ListView version 6.0) I am not able to run it on my client's computer. I get the error: "Could Not Load An Object. Not Available on This Machine." Even though it works well on my dev machine. The reason is: Microsoft applied a security patch (around August 2012, I think) to mscomctl.ocx, and my Excel 2010 installation files (downloaded not long ago) contain the new version of mscomctl.ocx. The clients using my Excel file haven't applied the security patch at this moment. So every time I publish my Excel file to the clients' computers (from my dev environment), I reference the new mscomctl.ocx. That's the problem. I already tried to get the old mscomctl.ocx from a client's computer and copy and register it on my dev machine, but then I got errors ("Class not registered", ...) when I create a userform and drag a listview onto it. So: mscomctl.ocx on the clients' machines is version 6.1.98.13 from 2008; mscomctl.ocx on my dev machine is version 6.1.98.34 from 2012. My question: does someone have an idea how to get a usable version of mscomctl.ocx on my dev machine? Thank you very much.

    Read the article

  • permission denied: /etc/apt/sources.list

    - by Eli
    I'm trying to install the Java JRE. I usually do it like this:

        sudo echo 'deb http://www.duinsoft.nl/pkg debs all' >> /etc/apt/sources.list
        sudo apt-key adv --keyserver keys.gnupg.net --recv-keys 5CB26B26
        sudo apt-get update
        sudo apt-get install update-sun-jre
        exit

    But when I do

        sudo echo 'deb http://www.duinsoft.nl/pkg debs all' >> /etc/apt/sources.list

    I see

        permission denied: /etc/apt/sources.list

    When I do ls -l /etc/apt/sources.list I see

        -rw-r--r-- 1 root root 3360 Aug 26 01:45 /etc/apt/sources.list

    When I do

        sudo mv /etc/apt/sources.list /etc/apt/sources.list.old
        sudo cat /etc/apt/sources.list.old | sudo tee /etc/apt/sources.list

    I see

        #deb cdrom:[Ubuntu 12.04 LTS _Precise Pangolin_ - Release amd64 (20120425)]/ dists/precise/main/binary-i386/
        #deb cdrom:[Ubuntu 12.04 LTS _Precise Pangolin_ - Release amd64 (20120425)]/ dists/precise/restricted/binary-i386/
        #deb cdrom:[Ubuntu 12.04 LTS _Precise Pangolin_ - Release amd64 (20120425)]/ precise main restricted

        # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
        # newer versions of the distribution.
        deb http://lb.archive.ubuntu.com/ubuntu/ precise main restricted
        deb-src http://lb.archive.ubuntu.com/ubuntu/ precise main restricted

        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://lb.archive.ubuntu.com/ubuntu/ precise-updates main restricted
        deb-src http://lb.archive.ubuntu.com/ubuntu/ precise-updates main restricted

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team. Also, please note that software in universe WILL NOT receive any
        ## review or updates from the Ubuntu security team.
        deb http://lb.archive.ubuntu.com/ubuntu/ precise universe
        deb-src http://lb.archive.ubuntu.com/ubuntu/ precise universe
        deb http://lb.archive.ubuntu.com/ubuntu/ precise-updates universe
        deb-src http://lb.archive.ubuntu.com/ubuntu/ precise-updates universe

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        deb http://lb.archive.ubuntu.com/ubuntu/ precise multiverse
        deb-src http://lb.archive.ubuntu.com/ubuntu/ precise multiverse
        deb http://lb.archive.ubuntu.com/ubuntu/ precise-updates multiverse
        deb-src http://lb.archive.ubuntu.com/ubuntu/ precise-updates multiverse

        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the main release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        deb http://lb.archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse
        deb-src http://lb.archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse

        deb http://security.ubuntu.com/ubuntu precise-security main restricted
        deb-src http://security.ubuntu.com/ubuntu precise-security main restricted
        deb http://security.ubuntu.com/ubuntu precise-security universe
        deb-src http://security.ubuntu.com/ubuntu precise-security universe
        deb http://security.ubuntu.com/ubuntu precise-security multiverse
        deb-src http://security.ubuntu.com/ubuntu precise-security multiverse

        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository.
        ## This software is not part of Ubuntu, but is offered by Canonical and the
        ## respective vendors as a service to Ubuntu users.
        # deb http://archive.canonical.com/ubuntu precise partner
        # deb-src http://archive.canonical.com/ubuntu precise partner

        ## This software is not part of Ubuntu, but is offered by third-party
        ## developers who want to ship their latest software.
        deb http://extras.ubuntu.com/ubuntu precise main
        deb-src http://extras.ubuntu.com/ubuntu precise main

    and the issue is not solved; I still see that permission error. I'm on a 64-bit laptop.
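    The permission error is coming from the shell, not from apt or echo: the >> redirection is performed by the calling user's shell before sudo ever runs, so it fails on the root-owned file regardless of sudo. A minimal sketch of the usual workarounds (same duinsoft line as above):

        echo 'deb http://www.duinsoft.nl/pkg debs all' | sudo tee -a /etc/apt/sources.list
        # or run the whole redirection inside a root shell:
        sudo sh -c "echo 'deb http://www.duinsoft.nl/pkg debs all' >> /etc/apt/sources.list"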

    Read the article

  • apt-get 403 Forbidden

    - by Lerp
    I've started a new job today and I am trying to set up my machine to run through their Windows server. I've managed to get an internet connection through the server, but now I can't run apt-get update as I get a "403 Forbidden" error. This happens for every repo in my sources list, apart from translations(?). I do have a proxy in apt.conf; if I don't have it I get a 407 Permission Denied error. Here's my apt.conf file (I have omitted my username and password):

        Acquire::http::proxy "http://username:[email protected]:8080/";

    Here's my sources.list:

        #deb cdrom:[Ubuntu 12.04.2 LTS _Precise Pangolin_ - Release amd64 (20130213)]/ dists/precise/main/binary-i386/
        #deb cdrom:[Ubuntu 12.04.2 LTS _Precise Pangolin_ - Release amd64 (20130213)]/ dists/precise/restricted/binary-i386/
        #deb cdrom:[Ubuntu 12.04.2 LTS _Precise Pangolin_ - Release amd64 (20130213)]/ precise main restricted

        # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
        # newer versions of the distribution.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise main restricted
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise main restricted

        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-updates main restricted
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise-updates main restricted

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team. Also, please note that software in universe WILL NOT receive any
        ## review or updates from the Ubuntu security team.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise universe
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise universe
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-updates universe
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise-updates universe

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise multiverse
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise multiverse
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-updates multiverse
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise-updates multiverse

        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the main release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        deb http://gb.archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse
        deb-src http://gb.archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse

        deb http://security.ubuntu.com/ubuntu precise-security main restricted
        deb-src http://security.ubuntu.com/ubuntu precise-security main restricted
        deb http://security.ubuntu.com/ubuntu precise-security universe
        deb-src http://security.ubuntu.com/ubuntu precise-security universe
        deb http://security.ubuntu.com/ubuntu precise-security multiverse
        deb-src http://security.ubuntu.com/ubuntu precise-security multiverse

        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository.
        ## This software is not part of Ubuntu, but is offered by Canonical and the
        ## respective vendors as a service to Ubuntu users.
        # deb http://archive.canonical.com/ubuntu precise partner
        # deb-src http://archive.canonical.com/ubuntu precise partner

        ## This software is not part of Ubuntu, but is offered by third-party
        ## developers who want to ship their latest software.
        deb http://extras.ubuntu.com/ubuntu precise main
        deb-src http://extras.ubuntu.com/ubuntu precise main

    I can sort of fix this by changing all the http entries in sources.list to ftp, but I still have issues with PPAs.
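    To separate an apt problem from a proxy problem, it can help to fetch one of the failing index files by hand through the same proxy; a small sketch (username, password and proxyhost stand in for the real values in apt.conf):

        # -I asks only for the headers; a 403 here as well means the proxy or its filtering
        # policy is blocking the archive, and apt itself is not at fault.
        curl -I -x "http://username:password@proxyhost:8080/" http://gb.archive.ubuntu.com/ubuntu/dists/precise/Release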

    Read the article

  • Help finding missing mumble-server dependencies

    - by Otoris
    I'm trying to install the mumble-server package using apt-get install mumble-server on Ubuntu 11.10 Server Edition on Rackspace Cloud. The problem is that it can't find dependencies it should be able to find, because they do exist on launchpad.net. Dependencies message:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Some packages could not be installed. This may mean that you have requested
        an impossible situation or if you are using the unstable distribution that
        some required packages have not yet been created or been moved out of Incoming.
        The following information may help to resolve the situation:

        The following packages have unmet dependencies:
         mumble-server : Depends: libavahi-compat-libdnssd1 (>= 0.6.16) but it is not installable
                         Depends: libprotobuf7 but it is not installable
                         Depends: libqt4-dbus (>= 4:4.5.3) but it is not installable
                         Depends: libqt4-network (>= 4:4.5.3) but it is not installable
                         Depends: libqt4-sql (>= 4:4.5.3) but it is not installable
                         Depends: libqt4-xml (>= 4:4.5.3) but it is not installable
                         Depends: libqtcore4 (>= 4:4.7.0~beta1) but it is not installable
                         Depends: libqt4-sql-sqlite but it is not installable
        E: Unable to correct problems, you have held broken packages.

    Any ideas on whether I might be missing sources? I've been googling around and haven't found anyone else in this situation, or anyone else unable to install the aforementioned packages. Thanks for your time! sources.list:

        deb http://mirror.rackspace.com/ubuntu/ oneiric restricted
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric restricted

        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://mirror.rackspace.com/ubuntu/ oneiric-updates restricted
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric-updates restricted

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team. Also, please note that software in universe WILL NOT receive any
        ## review or updates from the Ubuntu security team.
        deb http://mirror.rackspace.com/ubuntu/ oneiric universe
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric universe
        deb http://mirror.rackspace.com/ubuntu/ oneiric-updates universe
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric-updates universe

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        deb http://mirror.rackspace.com/ubuntu/ oneiric multiverse
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric multiverse
        deb http://mirror.rackspace.com/ubuntu/ oneiric-updates multiverse
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric-updates multiverse

        ## Uncomment the following two lines to add software from the 'backports'
        ## repository.
        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        # deb http://us.archive.ubuntu.com/ubuntu/ oneiric-backports restricted universe multiverse
        # deb-src http://us.archive.ubuntu.com/ubuntu/ oneiric-backports restricted universe multiverse

        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository. This software is not part of Ubuntu, but is
        ## offered by Canonical and the respective vendors as a service to Ubuntu
        ## users.
        # deb http://archive.canonical.com/ubuntu oneiric partner
        # deb-src http://archive.canonical.com/ubuntu oneiric partner

        deb http://security.ubuntu.com/ubuntu oneiric-security restricted
        deb-src http://security.ubuntu.com/ubuntu oneiric-security restricted
        deb http://security.ubuntu.com/ubuntu oneiric-security universe
        deb-src http://security.ubuntu.com/ubuntu oneiric-security universe
        deb http://security.ubuntu.com/ubuntu oneiric-security multiverse
        deb-src http://security.ubuntu.com/ubuntu oneiric-security multiverse

        # Cool Kid Webmin/Usermin Here Brah
        deb http://download.webmin.com/download/repository sarge contrib
        deb http://webmin.mirror.somersettechsolutions.co.uk/repository sarge contrib
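    One thing that stands out in the sources.list above is that no line carries the main component (only restricted, universe and multiverse are listed), and the unmet Qt and Avahi libraries normally come from main. A sketch of the lines that would add it, assuming that really is the gap:

        deb http://mirror.rackspace.com/ubuntu/ oneiric main
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric main
        deb http://mirror.rackspace.com/ubuntu/ oneiric-updates main
        deb-src http://mirror.rackspace.com/ubuntu/ oneiric-updates main
        deb http://security.ubuntu.com/ubuntu oneiric-security main
        deb-src http://security.ubuntu.com/ubuntu oneiric-security main

        sudo apt-get update && sudo apt-get install mumble-server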

    Read the article

  • Disable Google Chrome warning if security certificate is not trusted

    - by sippa
    Hi, I want to know if it's possible to disable the warning you get in Chrome when you try to go to some HTTPS site that doesn't have a trusted certificate. I have a few sites in my bookmarks that use HTTPS but none of them have trusted certificates, so each time I visit them I manually have to click "Proceed anyway" in the warning and it's getting kind of annoying. Is there any way to disable the warning or somehow add these sites to some kind of safe list? Thanks
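    Chrome itself has no per-site "ignore this warning" list; it trusts whatever the operating system's certificate store trusts, so the usual approach is to import each site's self-signed certificate as trusted. A sketch for Chrome on Windows, assuming the certificate has been exported from the warning page to site.cer (run from an elevated prompt, and only for certificates you control):

        certutil -addstore -f "Root" site.cer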

    Read the article

  • Strict security and virtual host isolation with Nginx?

    - by Hach-Que
    I currently have an Apache web server set up under which each virtual host is isolated using HTTPD-ITK and the AppArmor module. Each virtual host's workers are setuid/setgid by the server and are then placed in an AppArmor profile. I'm looking to use Nginx but I can't find any documentation on setting it up so that rather than the worker processes being shared between all virtual hosts, worker processes are per virtual host (and thus can be setuid / setgid). Is there any way to do this under Nginx?
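    Nginx has no direct equivalent of HTTPD-ITK: all virtual hosts share one pool of worker processes running as the single account named by the user directive, so workers cannot be setuid per vhost. A workaround sometimes used is one nginx instance per tenant, each with its own config, user, pid file and logs (and its own AppArmor profile), behind a shared front-end proxy; a rough sketch with placeholder names:

        # /etc/nginx/tenants/sitea.conf -- a dedicated instance for one virtual host
        user sitea sitea;                      # this instance's workers run as the tenant account
        pid /var/run/nginx-sitea.pid;
        error_log /var/log/nginx/sitea-error.log;

        events { worker_connections 512; }

        http {
            server {
                listen 127.0.0.1:8081;         # the shared front-end instance proxies the public port here
                root /srv/sitea/public;
            }
        }

        # started separately, e.g.: nginx -c /etc/nginx/tenants/sitea.conf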

    Read the article

  • Does pointing *.[int].mydomain.com to 192.168.1.[int] constitute a security threat

    - by Dave
    For testing purposes, I've found it's really useful to point whatever.machineIP.mydomain.com to 192.168.1.machineIP: that way we can test each other's code without fiddling with hosts files. I'm aware that this identifies our local IP addresses to the outside world, but if someone could access the network, it'd be trivial to sniff which of the local IP addresses respond on port 80 anyway. Is there anything I'm not seeing? Credit for the idea: http://news.ycombinator.com/item?id=1168896
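    For reference, in a BIND-style zone file for mydomain.com the scheme amounts to one wildcard record per machine (addresses as in the question):

        ; fragment of the mydomain.com zone
        *.10    IN  A   192.168.1.10    ; whatever.10.mydomain.com
        *.11    IN  A   192.168.1.11    ; whatever.11.mydomain.com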

    Read the article

  • RemoteApp Security Warning

    - by nairware
    I have a Windows 2012 Standard x64 RemoteApps RDWeb portal where I can launch apps. We have one remote app in particular which is RDP (mstsc.exe). Whenever a user launches it, they receive three different prompts--the second one is this alert (shown below). How can I get rid of this alert? I have other RemoteApps launching as well, and they do not throw errors or alerts like this one. And they are applications with the .exe extension, so I do not understand what is so unique about the RDP RemoteApp that would cause this alert. One thing perhaps worth mentioning is this particular RDP remote app points directly to the mstsc.exe executable residing on a particular session host/terminal server (as shown in the "From" value of the warning). As such, a gateway server would not be used to load-balance and choose the RDP client launched from a session host at random. This RDP RemoteApp is explicitly associated with one particular terminal server.

    Read the article

  • vagrant and puppet security for ssl certificates

    - by Sirex
    I'm pretty new to Vagrant; would someone who knows more about it (and Puppet) be able to explain how Vagrant deals with the SSL certs needed when making Vagrant testing machines that process the same node definition as the real production machines? I run Puppet in master/client mode, and I wish to spin up a Vagrant version of my Puppet production nodes, primarily to test new Puppet code against. If my production machine is, say, sql.domain.com, I spin up a Vagrant machine of, say, sql.vagrant.domain.com. In the Vagrantfile I then use the puppet_server provisioner and give a puppet.puppet_node entry of "sql.domain.com" so it gets the same Puppet node definition. On the Puppet server I use a regex of something like /*.sql.domain.com/ on that node entry so that both the Vagrant machine and the real one get that node entry on the Puppet server. Finally, I enable auto-signing for *.vagrant.domain.com in Puppet's autosign.conf, so the Vagrant machine gets signed. So far, so good... However: if one machine on my network gets rooted, say, unimportant.domain.com, what's to stop the attacker changing the hostname on that machine to sql.vagrant.domain.com, deleting the old Puppet SSL cert off of it and then re-running Puppet with a given node name of sql.domain.com? The new SSL cert would be autosigned by Puppet, match the node name regex, and then this hacked node would get all the juicy information intended for the sql machine?! One solution I can think of is to avoid autosigning, put the known Puppet SSL cert for the real production machine into the Vagrant shared directory, and then have a Vagrant ssh job move it into place. The downside of this is that I end up with all my SSL certs for each production machine sitting in one git repo (my Vagrant repo) and thereby on each developer's machine, which may or may not be an issue, but it doesn't sound like the right way of doing this. tl;dr: How do other people deal with Vagrant & Puppet SSL certificates for development or testing clones of production machines?
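    Regarding the autosigning worry: newer Puppet versions support policy-based autosigning, where autosign points at an executable that receives the certname as its argument and the PEM CSR on stdin, and its exit status decides whether to sign. Combined with a pre-shared secret that only the Vagrant boxes embed in their CSR (via csr_attributes.yaml, challengePassword OID 1.2.840.113549.1.9.7), a spoofed hostname alone is no longer enough. A rough sketch; the secret, the paths and the openssl parsing are assumptions that may need adjusting:

        # puppet.conf on the master, instead of wildcard entries in autosign.conf:
        #   [master]
        #   autosign = /etc/puppet/check-csr.sh

        #!/bin/sh
        # /etc/puppet/check-csr.sh -- $1 is the certname, the CSR arrives on stdin.
        # Sign only *.vagrant.domain.com requests that carry the shared secret.
        case "$1" in
          *.vagrant.domain.com) ;;
          *) exit 1 ;;
        esac
        openssl req -noout -text | grep -q 'vagrant-shared-secret'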

    Read the article

  • Howto: SaaS / PHP Application / Tenants / Security

    - by Ben Fransen
    Hi all, being completely new to the web hosting corner, I have a few questions on how to implement/set up a webserver for a SaaS application. I'm about to rent my own server for a new product (CMS) I'm launching in two months. Developing the system wasn't that much of a wild ride for me, but implementing it correctly is. So let's say this is my situation: I want to host 10 websites for 8 clients. There are 6 single sites, and two clients have two websites they can manage with my software.

    - The CMS must be placed on the server too; all clients connect to one system.
    - The database must be placed
    - Depending on the contract a client makes, the client gets some storage. How do I measure the used storage over the DB, the file system and email?
    - Clients may not, in any case, be able to somehow get outside their directory, but from the CMS directory the CMS must be able to create files and dirs in a client's directory (for templates, image galleries, widgets, etc.).

    I was thinking about a dir structure something like this:

        ./CMS/ [all CMS files]
        ./Websites/*/ [all websites]

    My hosting provider will install updates to the OS (CentOS, latest) and the admin panel (DirectAdmin). Is there anybody with experience on this topic? Or do you have some thoughts about it? Please join the conversation since I'm completely new to this. Ben
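    On a CentOS/DirectAdmin box running mod_php, one common way to keep each site locked into its own tree while the shared CMS can still write templates and galleries into it is a per-vhost open_basedir that lists both directories; a hedged sketch (paths and names are invented for illustration):

        # Apache vhost fragment for one client site (mod_php assumed)
        <VirtualHost *:80>
            ServerName clientsite.example
            DocumentRoot /home/clienta/domains/clientsite.example/public_html
            # PHP in this vhost may only touch the site's own tree, the shared CMS code and tmp
            php_admin_value open_basedir "/home/clienta/domains/clientsite.example/:/var/www/CMS/:/tmp/"
        </VirtualHost>

    Per-contract storage can then be watched with ordinary tools (du on the client directory, mailbox sizes, per-database totals from MySQL), though DirectAdmin's own quota reporting may already cover most of that.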

    Read the article

  • Understanding Security Certificates (and their pricing)

    - by John Robertson
    I work at a very small company so certificate costs need to be absolutely minimal. However for some applications we do Need to have our customers get that warm fuzzy not-using-a-self-signed certificate feeling. Since creating a "certificate authority" with makecert really just means creating a public/private key pair, it seems pretty clear that creating a public/private key pair FROM such a "certificate authority" really just means generating a second public/private key pair and signing both with the private key that belongs to the "certificate authority". Since the keys are signed anyone can verify they came from the certificate authority I created, or if verisign gave me the pair they sign it with one of their own private keys, and anyone can use verisigns corresponding public key to confirm verisign as the source of the keys. Given this I don't understand when I go to verisign or godaddy why they have rates only for yearly plans, when all I really want from them is a single public/private key pair signed with one of their private keys (so that anyone else can use their public keys to confirm that, yes, they gave me that public/private key pair and they confirmed I was who I said I was so you can trust my public/private key pair as belonging to a legitimate third party). Clearly I am misunderstanding something, what is it? Does verisign retire their public/private key pairs periodically so that my verisign signed key pair "expires" and I need new ones? Edit: I learned that the certificate has an internal expiration date and it also maintains an internal value stating whether it can be used to sign other certificates (i.e. sign other private/public key pairs stored as certificates). Can't I get a few (even one) non-signing certificate signed by someone like verisign that I can use for authentication/encryption without a yearly subscription?
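    For the mechanics described above (a CA key pair whose private key signs a second key pair), the private-CA version looks roughly like this with openssl; the names and lifetimes are arbitrary:

        # 1. The "certificate authority": a key pair plus a self-signed CA certificate.
        openssl genrsa -out ca.key 2048
        openssl req -new -x509 -key ca.key -out ca.crt -days 3650 -subj "/CN=My Tiny CA"

        # 2. The second key pair and a certificate signing request for it.
        openssl genrsa -out server.key 2048
        openssl req -new -key server.key -out server.csr -subj "/CN=myapp.example.com"

        # 3. Sign the request with the CA's private key; anyone who trusts ca.crt can now verify server.crt.
        openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key -CAcreateserial -out server.crt -days 365

    The -days value on the signed certificate lines up with the edit above: commercial certificates carry a built-in expiry, so a one-year certificate is essentially what the yearly rate buys.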

    Read the article

  • SOHO Netflix and network security

    - by TW
    I want to use WiFi for HiDef video, but I don't trust it for my office PCs. I've heard of VLANs, but I have no idea how to set them up or what (SOHO) hardware to buy. Other than getting 2 different DSL lines, how can I be absolutely sure that the PC side doesn't get hacked? What if I want to use MS Home Server as a backup device for both sides? Can I make it "read only" for the PC side, and physically change the cable if I need to restore? TW

    Read the article

  • Networked filesystem with user level security for linux

    - by Konrads
    Hi, I want to enable file sharing between servers and clients, both Linux. I don't want to rely on machine trust like in NFSv4, because client users will have root privileges. What are my options besides SMB (Samba)? Does OpenAFS support user-level authentication and access? Using mounted WebDAV/FTP/sshfs seems silly for a LAN.
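    One option that keeps NFSv4 but drops the machine-trust model is Kerberized NFS (sec=krb5, krb5i or krb5p), where access is tied to each user's Kerberos ticket rather than to the client host, so a root user on the client cannot simply impersonate others. A sketch of the export and mount, assuming a KDC and the nfs/ service principals are already in place:

        # /etc/exports on the server -- require Kerberos; krb5p also encrypts the traffic
        /srv/share  *(rw,sec=krb5p,no_subtree_check)

        # on a client
        sudo mount -t nfs4 -o sec=krb5p server.example.com:/srv/share /mnt/share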

    Read the article

  • Jungledisk file transfer security

    - by JC
    Does JungleDisk use https for file transfers? If so, does this mean a 3rd party cannot intercept content or even file names of files being backed up? (assume JungleDisks encrypt option is not being used)

    Read the article

  • Security: Managing network shares remotely on Ubuntu?

    - by Industrial
    Hi everyone, I am about to set up a home network server running Ubuntu Server and I'm currently a bit worried about how to handle network shares and permissions in a good way. After working a bit lately with Netgear's ReadyNAS units, I have become really spoiled by how easy it was to set up network shares and give a specific user different levels of access to a specific share (forbidden access, read, read/write). How would I accomplish the same with my Ubuntu server through SSH? Thanks a lot
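    If the shares end up being served with Samba, the ReadyNAS-style levels (forbidden, read, read/write) map fairly directly onto share options in /etc/samba/smb.conf, which is perfectly editable over SSH; a sketch with invented user and path names:

        [projects]
            path = /srv/shares/projects
            browseable = yes
            # "forbidden" level: anyone not listed here gets no access at all
            valid users = alice bob
            # default level for listed users: read-only...
            read only = yes
            # ...except the users granted read/write
            write list = alice

        # add Samba passwords for the accounts, then reload:
        #   sudo smbpasswd -a alice
        #   sudo smbcontrol all reload-config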

    Read the article

  • VS 2010 Security Warning When Opening My Own Projects

    - by Zian Choy
    Whenever I try to open my own projects in VS 2010 Express, I get the following message: You should only open projects from a trustworthy source I can click OK on the message and open the solution, but I would prefer to not get warned every time I open my solution. The files were not downloaded from the Internet; they are sitting right on my department's network drive. There's nothing to unblock if I look at the Properties window for the project file. Any tips for squashing this bug will be appreciated.

    Read the article

  • Outlook 2010: Cached Exchange Mode, File Storage and Security

    - by dangowans
    I'm in an environment where profile space is a premium, and most users have "frozen" machines, meaning that on restart, the C: drive is returned to its original state. Cached Exchange Mode sounds interesting to me, but I'm wondering if we can take advantage of it without causing other issues. Where in the file system does the cached data get stored? Is it in the profile? A temp folder? Is the cached file secured in some way to keep others from seeing it?

    Read the article
