Search Results

Search found 40159 results on 1607 pages for 'multiple users'.


  • External drives show up in Nautilus/Computer even when they are unplugged.

    - by Testament
    I have two 1TB Seagate USB (sdc1 and sdd1) drives connected to an old PC without an X server running. Since sdc1 and sdd1 change depending on the order in which they are plugged in, I decided to mount them using their UUID instead. These are my fstab entries:

        UUID=d1b28578-451b-4f03-af28-2e8a6d5b7efb /media/Seagate  ext3 defaults,rw,auto,users
        UUID=36bf5df4-934e-42d4-9e25-16a13971509c /media/Projects ext3 defaults,rw,auto,users

    They work fine, but when I unmount them and unplug the USB drives, they still show up in Nautilus (I'm running Nautilus with X11 forwarding onto another Ubuntu machine, btw). Now if I remove those entries from fstab, the drives disappear from Computer. If I add the entries back, they show up as an unmounted drive even when the drive is not plugged in. How do I do this so they don't show up when they're not plugged in?
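
    For reference, a minimal sketch of how the UUIDs used above can be listed, assuming the drives are attached as sdc1/sdd1 at the time:

        # Print the filesystem UUIDs so they can be matched against fstab
        sudo blkid /dev/sdc1 /dev/sdd1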

    Read the article

  • How to restrict deletion of a folder on NTFS share, but still allow modify access within folder

    - by thinkdreams
    I am setting up a set of scan folders from a scanning copier device, and would like to know the best way to protect the folders (for each department) from moving or deletion, but yet still allow access for the users to modify (i.e. create/add/delete) the scanned files within the folder. Structure is:

        Share Name
            Departmental Folder
                User files

    The writing of the files initially is taken care of by a service account which has full control. We'd just like to ensure the users cannot accidentally delete the folder (which has already happened) containing all the files, etc. This is for a Windows 2003 server, NTFS permissions. Suggestions would be most appreciated.
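
    A minimal sketch of one way to express this with NTFS ACLs using icacls (available from Server 2003 SP2 onward); the path and group name below are hypothetical and would need adjusting:

        :: Give the department group Modify on everything *inside* the folder (inherit-only ACE)
        icacls "D:\Scans\Accounting" /grant "DOMAIN\Accounting-Scans":(OI)(CI)(IO)M

        :: On the folder itself, allow listing/reading and creating files/folders, but not Delete
        icacls "D:\Scans\Accounting" /grant "DOMAIN\Accounting-Scans":(RX,WD,AD)

    Because the Modify ACE is inherit-only, users can still create and delete the scanned files, while the folder's own ACE omits Delete, so the folder itself cannot be removed (as long as no other ACE grants it).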

    Read the article

  • Windows 8 Remote Desktop only allows one user at a time?

    - by segmentation fault
    I tried connecting to Windows 8 using its built-in Remote Desktop feature, but for some inexplicable reason, it requires that no users are logged in on the target machine before a remote user can log in. This has never been a problem with rdesktop on Unixen; I could rdesktop from as many machines as I wanted and any logged-in users would never notice a thing. What's the problem with Windows? Any way to allow concurrent local and remote logins to a Windows 8 machine without hacks or cracks? The "guides" on how to do this that show up in the Google results all suggest replacing a system DLL with a hacked one, but that's not acceptable.

    Read the article

  • Access Control issue

    - by user160605
    OK, this is stumping me, mainly because of my lack of experience with access control. I have two folders I need to keep away from users: Payroll and Banking. I went into Security and took away all the users. I made a new group called "access granted" and added it to both folders, then gave Full Control to the group. I then added a few users to this group. I tested with partial success: I can only get into some folders and subfolders/files, even though I made sure I selected the option for all subfolders. This is my layout:

        C:\ (folder) -- permissions granted to admin, access (full control)

    When I look at the problem files/folders, no one has any permissions; I don't even see the group or admin. What am I doing wrong? Thanks
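
    If the problem files and folders show no permissions at all, inheritance from the parent may have been broken on them. A hedged sketch of re-applying the parent ACL down the tree with icacls; the paths assume the folders sit directly under C:\, and note that /reset discards any explicit ACEs on the children, so use with care:

        :: Replace child ACLs with inherited permissions, recursively, continuing past errors
        icacls "C:\Payroll" /reset /T /C
        icacls "C:\Banking" /reset /T /C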

    Read the article

  • Is it safe to set MySQL isolation to "Read Uncommitted" (dirty reads) for typical Web usage? Even with replication?

    - by Continuation
    I'm working on a website with a typical CRUD web usage pattern: similar to blogs or forums where users create/update content and other users read the content. Seems like it's OK to set the database's isolation level to "Read Uncommitted" (dirty reads) in this case. My understanding of the general drawback of "Read Uncommitted" is that a reader may read uncommitted data that will later be rolled back. In a CRUD blog/forum usage pattern, will there ever be any rollback? And even if there is, is there any major problem with reading uncommitted data? Right now I'm not using any replication, but in the future if I want to use replication (row-based, not statement-based) will a "Read Uncommitted" isolation level prevent me from doing so? What do you think? Has anyone tried using "Read Uncommitted" on their RDBMS?
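
    For reference, a minimal sketch of how the isolation level is set in MySQL, per session or server-wide:

        -- per connection/session
        SET SESSION TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

        -- or for the whole server, in my.cnf under [mysqld]:
        --   transaction-isolation = READ-UNCOMMITTED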

    Read the article

  • WiFi problems on several Ubuntu installations

    - by Rickyfresh
    Okay, this is the first time I have ever had to ask a question, as usually the Ubuntu community has answered everything already. On this occasion many people are asking for an answer but no good solution has appeared so far, so someone please help, or I will have to install Windows on my son's and my girlfriend's PCs, and that would be a disaster as I am trying to help convince people to move from Windows. I installed 12.04 on three computers on the same day: a Dell Inspiron (works perfectly), a Toshiba Satellite, and a home-built desktop. The Dell works perfectly, but the other two keep losing the wireless connection, and even when they are connected they stop reaching web sites; for some reason they search Google fine but will not load a site when a link is clicked. So far people have recommended in other forums:

        - Removing Network Manager and installing wicd (didn't solve it)
        - Changing the MTU in the wireless settings (didn't solve it)
        - All sorts of messing about with Firefox settings (this doesn't solve it, and even if it did it would leave most average PC users scratching their heads and wishing they had stuck to Windows)

    The problem exists on two very different machines with different wireless cards, so I doubt it's a driver or hardware issue; many other Ubuntu users are having the same problem with a vast array of different machines and wireless cards. Can someone please give a good solution to this, as it is going to turn a lot of people away from Ubuntu if it cannot be sorted. I would give some PC specs, but the two machines are vastly different, and the other people complaining of this problem also have very different systems, all showing the same issue.

    Read the article

  • When one DC crashes, TFS 2012 stops working

    - by blizz
    We have two Windows 2008 domain controllers. We installed the second DC only a few months ago. We also have a TFS 2012 server on the network. Today, when the older DC crashed, TFS stopped working completely. Local users received messages such as "You are not authorized to access ServerName\Collection". Remote users received messages such as "The server was used in your last session, but it might be offline or unreachable". So my question is, why did TFS not use the second, newer DC instead of just crashing along with the first DC?
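
    When it happens again, a quick way to see which domain controller the TFS machine is actually bound to (standard tools on Server 2008; the domain name below is a placeholder):

        :: Show the DC found by the locator and test the machine's secure channel
        nltest /dsgetdc:EXAMPLE
        nltest /sc_query:EXAMPLE
        echo %LOGONSERVER%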

    Read the article

  • How Do I Parse a String?

    - by Russ
    I am new to bash, and I am creating a script that loops through the files in a directory and, based on part of the filename, does something with the file. So far I have this:

        #!/bin/bash
        DIR="/Users/me/Documents/import/*"
        for f in "$DIR"
        do
          $t=??????
          echo "Loading $f int $t..."
        done

    so $f will output something like this: /Users/me/Documents/import/time_dim-1272037430173. Out of this, I want time_dim; the directory can be variable length and -1272037430173 is a fixed length (it's the unix timestamp btw). What is the best way to go about this?
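
    One possible sketch of the missing piece (not from the original post), using basename and parameter expansion; note the glob has to be left unquoted in the loop for it to expand:

        #!/bin/bash
        DIR="/Users/me/Documents/import"
        for f in "$DIR"/*; do
          name=$(basename "$f")   # e.g. time_dim-1272037430173
          t=${name%-*}            # strip the trailing -<timestamp>, leaving time_dim
          echo "Loading $f into $t..."
        done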

    Read the article

  • WSS 3.0 fails to hide quick launch items for which the current user does not have access

    - by Nils
    I'm running a Small Business Server 2008 with Windows SharePoint Services 3.0 (WSS 3.0). I thought WSS was supposed to hide menu items for which the currently logged-in user doesn't have access? Apparently, all users can see all links, regardless of whether they have access. This applies both to links to newly created sub-sites and to document libraries/lists. Is this expected behaviour, or is there a misconfiguration somewhere that causes the links to stay visible even for users without access? Thanks!

    Read the article

  • Group Policy for Setting Passwords: Server 2003 Domain

    - by user1236435
    In my 2003 domain, I am being asked to set a password policy that requires passwords to expire every 4 months, and also requires users to change their password at their next login, due to a security issue. In my domain, my OUs are set up by location, then drilled down to city, and the users and computers are in separate sub-domains. My question is, how do I set this up for my domain? Will I need to set the policy up for loopback? Can I configure this for just a specific OU? Any suggestions on how to move forward? Any advice is much appreciated, and thanks in advance!
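
    For illustration, a hedged sketch of how the two pieces are usually handled on Server 2003: maximum password age is a domain-wide setting (per-OU password policies only arrived with Server 2008 fine-grained policies), while the forced change at next logon can be flagged per user. The OU path below is a placeholder:

        :: Domain-wide maximum password age of roughly 4 months (120 days)
        net accounts /maxpwage:120 /domain

        :: Flag every user in a given OU to change password at next logon
        dsquery user "OU=Users,OU=Chicago,DC=example,DC=com" -limit 0 | dsmod user -mustchpwd yes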

    Read the article

  • Need some critique on .NET/WCF SOA architecture plan

    - by user998101
    I am working on a refactoring of some services and would appreciate some critique on my general approach. I am working with three back-end data systems and need to expose an authenticated front-end API over http binding, JSON, and REST for internal apps as well as 3rd party integration. I've got a rough idea below that's a hybrid of what I have and where I intend to wind up. I intend to build guidance extensions to support this architecture so that devs can build this out quickly. Here's the current idea for our structure:

        Front-end WCF routing service (spread across multiple IIS servers via hardware load balancer)
            Load balancing of services behind routing is handled within the routing service, probably round-robin
            One of the services will be a token service
            Multiple bindings per service exposed to address JSON, REST, and whatever else comes up later
            All in/out is handled via POCO DTOs
            Use Unity to scan for what services are available and expose them
        Front-end services behind the routing service do nothing more than expose the API and do conversion of DTO<->Entity
            Unity injects the service implementation to allow mocking
            AutoMapper for DTO/Entity conversion
            Invoke WF services where a response is required immediately
            Queue to ESB for async WF -- ESB will invoke WF later
        Business logic WF layer
            Exposes the same API as the front-end services
            Implements business logic
            Wraps transaction context where needed
            Calls out to composite/atomic services
        Composite/atomic services
            Exposed as WCF
            One service per back-end system
            Standard atomic CRUD operations plus composite operations
            Supports transaction context

    The questions I have are: Are the separation of concerns outlined above beneficial? The current thought is that each layer below is its own project, except the back-end stuff, where each system gets one project. The project has a service host, and all the services are under a Services folder. Interfaces live in a separate project at each layer. DTOs and Entities are in two separate projects under a shared folder. I am currently planning to build dedicated services for shared functionality such as logging, and to overload things like TraceListener to call those services. Is this a valid approach? Any other suggestions/comments?

    Read the article

  • Win 2003 Junction Point to Remote Unix Share

    - by Pogrindis
    Env:

        Windows Server 2003 with already established shared folders over the local domain via Windows DC and AD.
        Linux box being used as a file server, with the folder /files/share being R+W for all domain users; this is not a problem.

    I have already transferred the files from the Windows box to /files/share on the Linux box; however, I now want to create a junction point in order to prevent users saving to the Windows box. I have tried the File Server Administration tool on Windows Server 2003, but it will not allow me to junction to remote servers. I have tried mounting the remote filesystem as a drive and proceeding that way, but no joy. Anyone have any suggestions?

    Read the article

  • Graphing per-user CPU usage on a Linux machine

    - by mart1n
    I want to graph (graphical output would be great, i.e. a .png file) the following situation: I have users A, B, and C. I limit their resources so that when all users run a CPU intensive task at the same time, those processes will use 25%, 25%, and 50% of CPU. I know I can get the real-time stats using top but have no idea what to do with them. I've searched through the huge top man page but haven't found much on the subject of outputting data that can be graphed. Ideally, the graph would show a span of maybe 30 seconds. Any ideas how to achieve this?
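
    A minimal sketch of one way to collect graphable data: sample per-user CPU with ps, aggregate with awk, and feed the resulting file to gnuplot or a spreadsheet. The 30-second window and file name are arbitrary:

        #!/bin/bash
        # Append "second user total_cpu" samples once per second for 30 seconds
        for i in $(seq 1 30); do
            ps -eo user,pcpu --no-headers \
                | awk -v t="$i" '{cpu[$1]+=$2} END {for (u in cpu) print t, u, cpu[u]}' >> cpu-by-user.dat
            sleep 1
        done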

    Read the article

  • Is it possible to track down who or what changed a shared permission?

    - by user45574
    Today I received an email from one of my users asking why he couldn’t access his shared folder on one of our servers. Example: \\servername\share\ = access denied. When I checked the share permissions on the folder I was surprised to see that the user had been removed from the "shared permissions" list. Now my question is: is it possible to track who or what deleted the user's share permissions on the folder? I have studied the different event logs, but couldn’t find any indication of anyone who had changed the share permissions. Kind regards, Martin
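
    After the fact there is usually nothing to find unless auditing was already enabled. As a hedged sketch for catching the next occurrence (the subcategory syntax assumes a Server 2008-level box; older systems use the 'Audit object access' policy plus a SACL instead):

        :: Record share-level changes in the Security event log from now on
        auditpol /set /subcategory:"File Share" /success:enable /failure:enable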

    Read the article

  • Dynamic authentication realms in Apache

    - by Cogsy
    I have a front end server acting as a gateway proxy for many (a dynamic 'many') building monitors with embedded webservers. They are accessed with a URL like: http://www.example.com/monitor1/ http://www.example.com/monitor2/ ... I'm trying to restrict access to these monitors to only the users that own them. So what I need is a way of specifying rights to users or groups for specific directories. The standard auth mechanisms I see in Apache won't work because I need to specify every location. I'd prefer some dynamic map or script. Any suggestions?
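
    One hedged sketch that avoids repeating the auth stanza for every monitor is mod_macro (assuming that module is available; the file paths and group names are placeholders). Each new monitor then needs only a single Use line:

        # Define the auth block once
        <Macro Monitor $name $group>
          <Location "/$name/">
            AuthType Basic
            AuthName "Monitor $name"
            AuthUserFile /etc/apache2/monitor.passwd
            AuthGroupFile /etc/apache2/monitor.groups
            Require group $group
          </Location>
        </Macro>

        # One line per monitor / owner group
        Use Monitor monitor1 building1-owners
        Use Monitor monitor2 building2-owners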

    Read the article

  • Default profile for large

    - by user63434
    Hi, I am setting up a master image to clone to all client machines of the same type running Windows 7. I log in as administrator, installed all the programs and changed the desktop settings etc., but my local administrator profile is 244 MB in size, and it will become the default profile of the local machine when sysprepped. We have a 2003 server, and I want to use a mandatory profile for all login users, which means I need to copy this profile to the server so that when any user logs in to the domain they use this profile. Loading a 244 MB profile is going to be very slow, since it is removed from the client when they log off, so the next time they log in it will take a long time again. Is there anything I can do? Can I just copy the bare minimum files from the default profile to the server? I am not sure what parts I need; I read that I must copy My Documents and My Documents/Pictures so that folder redirection will work. What else do I need to copy to the server? I have Firefox Xmarks sync and MS Word etc. Thanks

    Read the article

  • Disable address bar in Internet Explorer 9

    - by token
    I'm trying to disable the address bar in IE9. I've done a significant amount of searching on this and just can't seem to find a way to make it happen. A lot of web resources discuss how to do it in IE8, but not IE9. The reason you might ask? I have an application being hosted in a remote desktop farm that links to web pages outside of the application into Internet Explorer. I need to ensure users are limited to just going to the pages the program pushes them to. I realize I could use a proxy server to limit where they can go, but I'm trying to find a really simple way to just disable the address bar instead. I can't use Kiosk mode because it puts the browser into full screen mode. This won't work for my situation as I need to give users what appears to be a regular browsing experience without an address bar.

    Read the article

  • Can you recommend a game server for a facebook board game?

    - by Yekmer Simsek
    I am seeking a game server that will scale well. All commercial and/or free software alternatives are welcome. The game will be a board game similar to poker. Some technical details are listed below:

        - A table consists of 4 people; to send them messages I need a reliable channel manager.
        - A table will be ready to play for at least 5 minutes.
        - People will wait for some time, and if they are not playing they will be kicked by the server, so there should be a reliable timed task queue to execute such tasks.
        - The server should be quick enough to respond and show changes to all 4 people at a table simultaneously; to achieve this it should have a powerful I/O library.
        - I am thinking of keeping state in memory for quick response times, but that brings scalability problems, and some variables would need to be thread safe across multiple nodes.
        - Flash (AS3) and Unity (.NET 2.0 C# mono) client APIs should be available for socket connection.

    PS: I am currently using the RedDwarf server, but it lacks documentation and multiple-node support.

    Read the article

  • Postfix - How to process incoming emails?

    - by Borivojevic
    Hello, does anybody know how to process incoming emails for virtual mailboxes in Postfix? I am building a web application where users add new content by sending emails to the application. The email address used for each user is custom (e.g. [email protected]) and is dynamically created as a Postfix virtual mailbox. A user needs to be able to send email to his custom mailbox address ([email protected]), and I want to process each incoming email, parse its contents, and populate my database with data from the email. I tried using a Postfix after-queue filter, but what I really want is to process emails once they are saved in the user's virtual mailbox folder.
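
    For comparison, a hedged sketch of the usual "hand incoming mail to a script" route in Postfix: a pipe transport in master.cf plus a transport map. The service name, user and script path are placeholders, and this delivers to the script instead of into the mailbox folder:

        # /etc/postfix/master.cf
        ingest    unix  -       n       n       -       -       pipe
          flags=FR user=www-data argv=/var/www/app/bin/ingest-mail.sh ${recipient}

        # /etc/postfix/main.cf (plus a transport file mapping the domain to the service)
        # transport_maps = hash:/etc/postfix/transport
        #   e.g. a line in /etc/postfix/transport:   example.com   ingest: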

    Read the article

  • What can cause a black or blank screen when pressing logout or switch user in windows 7

    - by Medran
    The situation here involves a brand new Dell machine with a GTX 260 video card. Put simply, after one user logs in, when that user either switches users or logs out, Windows goes to a black/blank screen. The TV screen used with this computer previously worked fine with fast user switching on an XP machine; the new computer is Windows 7. The TV is not showing any error message like 'no signal' or anything else that would be displayed if the computer were off. You can fix the problem by cycling the input on the television; after cycling the input, the welcome screen appears as normal. What I want to know is what on earth Windows is doing that would cause the video card to stop sending the same signal to the monitor when the user logs off or switches users. As far as I can tell, the resolution is identical between Windows and the welcome screen. If somebody has experienced this before, a fix would be great too.

    Read the article

  • What TLDs should I use for my NS records for redundancy? (DNSSEC support required)

    - by makerofthings7
    Question: As a general practice, is it a good idea to use multiple TLDs for the name servers? How should I choose which TLD would be a good candidate for hosting my NS names?

    More info: I am switching over 800 DNS zones to an outsourced DNS provider. I originally planned on setting the zone names to nsX.company.com, but think it would be best to have multiple TLDs such as .net, .org and .info. Since I plan on supporting DNSSEC at company.com, I think all the first-tier name servers must support it as well. Part of the inspiration for this question came from our provider, UltraDNS. In their configuration screen for our domains, they actively verify and alert us if our name servers aren't exactly:

        pdns1.ultradns.net
        pdns2.ultradns.net
        pdns3.ultradns.org
        pdns4.ultradns.org
        pdns5.ultradns.info
        pdns6.ultradns.co.uk
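
    As a quick hedged check, dig shows whether a given TLD zone is signed (and can therefore carry DS records for your delegation), and whether a particular delegation already has a DS record published:

        # A signed TLD returns DNSKEY records; an unsigned one returns nothing
        dig +short DNSKEY net.
        dig +short DNSKEY info.

        # Does this delegation have a DS record in its parent zone?
        dig +short DS ultradns.org.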

    Read the article

  • MVC 4 Authentication

    - by Aligned
    First: after searching for a while to figure out what's new/different with MVC 4 and forms authentication, this is the best article I've found on the subject: http://weblogs.asp.net/jgalloway/archive/2012/08/29/simplemembership-membership-providers-universal-providers-and-the-new-asp-net-4-5-web-forms-and-asp-net-mvc-4-templates.aspx

    Some quotes from the article:

        "The ASP.NET Web Pages team designed SimpleMembership to (wait for it) simplify the task of dealing with membership"
        "WSAT is built to work with ASP.NET Membership, and is not compatible with Simple Membership. There are two main options there: Use the WebSecurity and OAuthWebSecurity API to manage the users and roles; Create a web admin using the above APIs. Since SimpleMembership runs on top of your database, you can update your users as you would any other data - via EF or even in direct database edits (in development, of course)"
        "If you want to use an existing ASP.NET Membership Provider in ASP.NET MVC 4, you can't use the new AccountController. You can do a few things:"
        "Universal Providers do not work with Simple Membership." ~ this post (look for Bob.at.SBS's answer) says Universal Providers are not needed for MVC 4 to work in Azure

    I've been trying to figure out forms authentication in MVC 4. It's different from the past approach (aspnet_regsql). If you do File > New Project > MVC 4 > Internet Application, you get a really nice template with the controller and model set up for you. However, the tables are different from those created by aspnet_regsql, and the ASP.NET Configuration tool (WSAT) wasn't connecting to the data I had (it was creating an App_Data/aspnet.mdf file, which I didn't see right away).

    Points of note:

        The database tables are created in the SimpleMembershipInitializer class when you first run your app, using Entity Framework 5 migration functionality.
        The tables created are webpages_Membership, webpages_OAuthMembership, webpages_Roles, webpages_UsersInRoles, and UserProfile.
        Web.config settings don't seem to be needed.

    Scott Hanselman on Universal Providers was also useful, if somewhat out of date. Universal Providers and SimpleMembership are not compatible. http://www.asp.net/web-pages/tutorials/security/16-adding-security-and-membership - walk-through

    Read the article

  • Updating shared files across computers

    - by murgatroid99
    I have a file server running Windows Server 2008 and a couple of laptops running Windows 7 on a network. There are a large number of files that all users will need access to. My plan is to have the files on both the server and the laptops because the users will need to access the files in places with no Internet access. I also want any changes made to the files on any of the laptops to propagate to the server and then propagate to the other laptops whenever they connect to the network. Should I do this with a scheduled batch script with a few xcopy commands or is there a better way to do it?
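
    A hedged sketch of the scheduled-script route using robocopy rather than xcopy (paths are placeholders; /XO copies only files that are newer than the destination copy, and this naive two-pass approach does not handle deletions or true edit conflicts):

        :: Pull files that are newer on the server, then push files that are newer on the laptop
        robocopy "\\server\share" "C:\SharedFiles" /E /XO /R:2 /W:5 /LOG+:C:\sync-pull.log
        robocopy "C:\SharedFiles" "\\server\share" /E /XO /R:2 /W:5 /LOG+:C:\sync-push.log

    Offline Files or DFS Replication may be more robust alternatives for this pattern, since they are designed to handle synchronization and conflicts.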

    Read the article

  • Emailing Service: To or Bcc?

    - by Shelakel
    I'm busy coding a reusable e-mail service for my company. The e-mail service will be doing quite a few things via injection through the strategy pattern (such as handling e-mail send-rate throttling, switching between SMTP and Amazon SES or Google App Engine e-mail clients when daily quotas are exceeded, and send-statistics tracking, mostly because it is necessary in order to stay within quotas, to name a few). Because e-mail sending will need to be throttled and other limitations exist (e.g. the maximum recipient quota on Amazon SES limits recipients to 50 per send), the e-mails typically need to be broken up. From your experience, would it be better to send bulk e-mails (multiple recipients per e-mail) or a single e-mail per recipient? The implication of the above is that to send to 1000 recipients, with a limit of 50 per send, you would send 20 e-mails using BCC in a newsletter scenario. When sending an e-mail per recipient, it would send 1000 e-mails. E-mail sending is asynchronous (due to inherent latency when sending, it's typically only possible to send about 5 e-mails per second unless you are using multiple clients asynchronously).

    Edit: Just for full disclosure, this service won't be used by or sold to spammers and will as far as possible automatically comply with national and international laws.

    Closed: Thanks for all the valuable feedback. The concerns regarding compliance with laws, user experience (generic vs. personalized unsubscribe) and spam regulation via ISP blacklisting make To the preferred, and possibly the only, choice when sending system-generated e-mails to recipients.

    Read the article
