Search Results

Search found 20168 results on 807 pages for 'service'.


  • Gathering application architecture

    - by userbb
    Suppose there is a system for gathering information about system activities. There is a client part with an interface, and there are agent parts installed on each machine. I estimate there could be at most 20 computers now; later it could be more, like 50. My candidate solutions:
    1. Each agent stores data in a local database, e.g. SQLite. There is also a service the client can use to query the data, so if a client wants to display data for 50 computers, it sends a query to 50 computers. I'm on this solution now, but maybe it's totally wrong.
    2. Each agent stores data in a local database (I don't know a good one for that). There is also a server (main database), and the local databases are synchronized with the server. In this case, a client connects to the main database to display data.
    3. Each agent sends data in real time to the main database. Same as point 2, but there is no sync step.
    4. Like point 3, but the agent buffers data in a local database and sends it in small chunks to the main database.
    What is the best approach?
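    As a rough illustration of option 4, here is a minimal Python sketch of an agent that buffers events in SQLite and ships them to the central database in small chunks. The table layout, file name, chunk size and send_to_server transport are placeholders for the example, not part of the original question:

        import sqlite3, time

        BUFFER_DB = "agent_buffer.db"   # hypothetical local buffer file
        CHUNK_SIZE = 100                # rows sent per flush; tune to taste

        def init_buffer():
            con = sqlite3.connect(BUFFER_DB)
            con.execute("CREATE TABLE IF NOT EXISTS events "
                        "(id INTEGER PRIMARY KEY, ts REAL, payload TEXT)")
            con.commit()
            return con

        def record_event(con, payload):
            # Local write always succeeds, even if the central server is unreachable.
            con.execute("INSERT INTO events (ts, payload) VALUES (?, ?)",
                        (time.time(), payload))
            con.commit()

        def flush_chunk(con, send_to_server):
            # Send up to CHUNK_SIZE buffered rows; delete them only after the send succeeds.
            rows = con.execute("SELECT id, ts, payload FROM events ORDER BY id LIMIT ?",
                               (CHUNK_SIZE,)).fetchall()
            if rows and send_to_server(rows):   # send_to_server is whatever transport you choose
                con.execute("DELETE FROM events WHERE id <= ?", (rows[-1][0],))
                con.commit()
            return len(rows)

    Calling flush_chunk on a timer gives you the "small chunks" behaviour, and the local buffer keeps working through network outages.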

    Read the article

  • Stopping/Starting windows services

    - by Geek
    I have four Windows services which start automatically when the machine starts. Thereafter, I want to restart those services every 8 hours in a particular order, e.g. stop s1, s2, s3, s4 and then restart them in some other order like s4, s3, s2, s1. The condition is that I must wait for each service to stop completely before I stop the next one. I would like to write a .BAT file or some script. Is it possible to define a CRON-style "every 8 hours" schedule? It is not there in the advanced task options. Can I do it using the Windows scheduler? Please suggest. Thanks in advance.
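    One way to do this is a small script that Task Scheduler runs on a "repeat every 8 hours" trigger. Below is a hedged Python sketch that shells out to the standard sc.exe tool and polls each service until it reports STOPPED before moving on; the service names and timeout are placeholders:

        import subprocess, time

        STOP_ORDER  = ["s1", "s2", "s3", "s4"]   # placeholder service names
        START_ORDER = ["s4", "s3", "s2", "s1"]

        def is_stopped(name):
            # "sc query <name>" prints a STATE line containing STOPPED or RUNNING
            out = subprocess.run(["sc", "query", name],
                                 capture_output=True, text=True).stdout
            return "STOPPED" in out

        def wait_until_stopped(name, timeout=120):
            deadline = time.time() + timeout
            while time.time() < deadline:
                if is_stopped(name):
                    return
                time.sleep(2)
            raise RuntimeError(name + " did not stop within the timeout")

        for svc in STOP_ORDER:
            subprocess.run(["sc", "stop", svc])
            wait_until_stopped(svc)      # don't touch the next service until this one is fully down

        for svc in START_ORDER:
            subprocess.run(["sc", "start", svc])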

    Read the article

  • rsync assigns deny permission

    - by user773478
    Currently a script is used to copy files with rsync (version 2.6.9, protocol version 29) from Linux/Unix servers to a W2K3 server, using a very basic command such as "rsync -v source_server::share_name/file_name /cygdrive///file_name". The script then makes a copy of the downloaded file for other purposes. This is part of a larger middleware stack that is being moved to new hardware on W2K8R2. The second part, making a copy of the file, does not work with the more recent rsync client, version 3.0.7, protocol version 30 (it shows up as cwRsync in Add/Remove Programs). The reason is that rsync assigns special permissions to the file, including a deny entry. The user (a service account) which downloads the file is in the local admin group. The file can be copied elsewhere using rsync, and it can be deleted, but it cannot be opened or copied locally by the same user because the deny permission takes precedence.
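    This is the usual symptom of newer Cygwin-based rsync builds mapping POSIX modes onto NTFS ACLs. Two commonly suggested workarounds, shown here only as hedged examples with placeholder paths: force plain permissions on the receiving side with rsync's --chmod option, or add noacl to the cygdrive mount in the cwRsync/Cygwin etc/fstab so NTFS ACLs are left alone.

        # Force readable/writable permissions on received files (rsync >= 2.6.7)
        rsync -v --chmod=ugo=rwX source_server::share_name/file_name /cygdrive/c/dest/file_name

        # Or, in the cwRsync/Cygwin etc/fstab, disable ACL handling for the cygdrive mounts:
        none /cygdrive cygdrive binary,posix=0,user,noacl 0 0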

    Read the article

  • New and Improved Search Helpers Now Procurement Assistants!

    - by LuciaC
    Check out the new and improved Procurement Assistants (formerly Search Helpers). Let us guide you simply to issue resolution. To access all our Procurement Search Helpers, see Doc ID 1391694.2, our Procurement Information Center, where you will find links to each of them. Assistants provide a collection of solutions based on the symptoms you enter: simply choose the radio button that pertains to your issue, then choose the additional symptom(s) that apply, and potential solution documents will be returned. Try these before logging a Service Request. Current Procurement Assistants:
        Doc ID       Title
        1361856.1    Assistant: Oracle Purchasing - Purchase Order and Requisition Approval (Search Helper)
        1377764.1    Assistant: Oracle Purchasing PO Output for Communication / Supplier Notification Issues (Search Helper)
        1364360.1    Assistant: Oracle Purchasing Requisition To Purchase Order (Search Helper)
        1369663.1    Assistant: Oracle Purchasing Purchase Document Open Interface and API (Search Helper)
        1391970.1    Assistant: Oracle Inventory Management RVTII-060 Errors in Receiving (Search Helper)
        1394392.1    Assistant: Oracle Purchasing Buyer Work Center Search Helper (Search Helper)
        1470034.1    Assistant: Oracle Purchasing - Document Control : Cancel and Close

    Read the article

  • NFS mount of /var/www to OS X

    - by ploughguy
    I have spent two hours trying to create an NFS mount from my Ubuntu 10.04 LTS server to my OS X desktop system. Objective: a three-way file compare between the code base on the Mac, the development copy on the local Linux test system, and the hosted website. The hosted service uses cPanel, so I can mount a web disk - easy as pie, took 10 seconds. The local Ubuntu box, on the other hand - nothing but pain and frustration. Here is what I have tried:
    1. In File Browser, navigate to /var/www/site and right-click. Select "Share this folder". Enter the share name wwwsite and a comment. Click the "Create Share" button. A message says you can only share file systems you own. There is a message on how to fix this, but the killer is that this shares over SMB, which will change the LFs to CR-LFs and break the file comparison. So forget this option.
    2. In a terminal window, run shares-admin (I have not been able to convince it to give me the "Shared Folders" option in the System Administration window - maybe it is somewhere else in the menu, but I cannot find it) and define an NFS export. Enter the path /var/www/site, select NFS, enter the IP address of the iMac and save. On the Mac, try to mount the file system using the usual methods - Finder, the command-line "mount" command - not found. Nothing.
    3. Tried restarting the Linux box in case there is a daemon that needs restarting - nothing.
    So I have run out of stuff to do. I have tried searching the documentation - it is pretty basic. The man page documentation is as opaque as ever. Please, oh please, will someone help me to get this @38&@^# thing to work! Thanks for reading this far... PG.
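    For reference, a command-line NFS export usually comes down to the steps below. This is a hedged sketch with placeholder addresses, not a transcript from the poster's systems; on OS X the -o resvport option matters because the Linux NFS server rejects requests from non-privileged ports by default.

        # On the Ubuntu 10.04 server
        sudo apt-get install nfs-kernel-server
        echo '/var/www/site 192.168.1.50(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
        sudo exportfs -ra

        # On the OS X desktop (192.168.1.50 in this sketch; the server is 192.168.1.10)
        mkdir -p ~/mnt/site
        sudo mount -t nfs -o resvport,rw 192.168.1.10:/var/www/site ~/mnt/site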

    Read the article

  • Need an FTP Client to run on a server and allow scheduling and not need a login to run

    - by William Todd Salzman
    I am looking at FTP clients to transfer files from an external FTP server. I need to place this FTP client on a server in the DMZ that will not be routinely logged in to, so the client needs to run as a service or something like that. I need the client to retrieve files from the server on a schedule (Tuesday mornings) and drop them in a local directory for pickup by another process. I would also like the solution to be capable of performing SFTP transfers. Most marketing material is geared toward someone running this on their desktop, not on a server, so several of my points never appear in the product specs. Update: the DMZ server can run either Windows or Linux.
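    If rolling your own is acceptable, a small script plus the OS scheduler (Task Scheduler on Windows, cron on Linux) covers this without anyone logging in interactively. Here is a hedged Python sketch using the paramiko library for the SFTP case; the host, account, key file and paths are placeholders:

        import paramiko

        HOST, PORT = "ftp.example.com", 22                     # placeholder endpoint
        USER, KEYFILE = "transfer", "/etc/transfer/id_rsa"     # key auth avoids storing a password
        REMOTE, LOCAL = "/outbound/report.csv", "C:/dropzone/report.csv"

        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin the host key properly in production
        client.connect(HOST, port=PORT, username=USER, key_filename=KEYFILE)

        sftp = client.open_sftp()
        sftp.get(REMOTE, LOCAL)    # download into the local pickup directory
        sftp.close()
        client.close()

    Scheduling it for Tuesday mornings is then just a weekly Task Scheduler trigger (or a cron line such as 0 6 * * 2) running the script under a service account.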

    Read the article

  • Repository query conditions, dependencies and DRY

    - by vFragosop
    To keep it simple, let's suppose an application which has Accounts and Users. Each account may have any number of users. There are also three consumers of UserRepository:
    - an admin interface which may list all users;
    - a public front-end which may list all users;
    - an account-authenticated API which should only list its own users.
    Assuming UserRepository is something like this:

        class UsersRepository extends DatabaseAbstraction {
            private function query() {
                return $this->database()->select('users.*');
            }

            public function getAll() {
                return $this->query()->exec();
            }

            // IMPORTANT:
            // Tons of other methods for searching, filtering,
            // joining of other tables, ordering and such...
        }

    Keeping in mind the comment above, and the need to abstract the user querying conditions, how should I handle querying users filtered by account_id? I can picture three possible roads:
    1. Should I create an AccountUsersRepository?

        class AccountUsersRepository extends UserRepository {
            public function __construct(Account $account) {
                $this->account = $account;
            }

            private function query() {
                return parent::query()
                    ->where('account_id', '=', $this->account->id);
            }
        }

    This has the advantage of reducing the duplication of UsersRepository methods, but it doesn't quite fit into anything I've read about DDD so far (I'm a rookie, by the way).
    2. Should I put it as a method on AccountsRepository?

        class AccountsRepository extends DatabaseAbstraction {
            public function getAccountUsers(Account $account) {
                return $this->database()
                    ->select('users.*')
                    ->where('account_id', '=', $account->id)
                    ->exec();
            }
        }

    This requires duplicating all UserRepository methods and may need another UserQuery layer that implements that querying logic in a chainable way.
    3. Should I query UserRepository from within my Account entity?

        class Account extends Entity {
            public function getUsers() {
                return UserRepository::findByAccountId($this->id);
            }
        }

    This feels more like an aggregate root to me, but it introduces a dependency between the Account entity and UserRepository, which may violate a few principles.
    4. Or am I missing the point completely? Maybe there's an even better solution?
    Footnote: permissions are a Service concern, but in my understanding they shouldn't implement SQL queries; that should be left to repositories, since those may not even be SQL-driven.

    Read the article

  • USB Mouse doesn't work after turning on Laptop

    - by Barry
    I have a USB mouse attached to my laptop which does not work when I switch the laptop on. I have to unplug it and plug it back in before it works. When I do this, no driver installation occurs (presumably because that has already been done); the usual beep sound does occur and the mouse starts working again. If my laptop goes to sleep, I can move the mouse and the laptop comes back to life. In fact it works perfectly apart from this annoying niggle on startup. Can anyone shed any light on why I have to keep unplugging and re-plugging my mouse on startup? I am running Windows 7 Home Premium Service Pack 1 (64-bit) on a Toshiba Satellite L755-1LL laptop.

    Read the article

  • How to restrict deletion of a folder on NTFS share, but still allow modify access within folder

    - by thinkdreams
    I am setting up a set of scan folders for a scanning copier device, and would like to know the best way to protect the folders (one per department) from being moved or deleted, while still allowing users to modify (i.e. create/add/delete) the scanned files within them. The structure is:
        Share Name
            Departmental Folder
                User files
    The initial writing of the files is taken care of by a service account which has full control. We'd just like to ensure the users cannot accidentally delete the folder containing all the files (which has already happened). This is a Windows 2003 server with NTFS permissions. Suggestions would be most appreciated.
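    On NTFS this is normally done by splitting the permissions: grant Modify on the departmental folder, subfolders and files, then add an explicit Deny for Delete on the departmental folder itself only. If icacls is available (Server 2003 SP2 and later), a hedged example with placeholder paths and group names might look like this:

        rem Users may create, change and delete scan files inside the folder
        icacls "D:\Scans\Accounting" /grant "DOMAIN\Accounting Users:(OI)(CI)M"

        rem ...but an explicit deny (this folder only) stops them deleting or renaming the folder itself
        icacls "D:\Scans\Accounting" /deny "DOMAIN\Accounting Users:(DE)"

    The deny applies only to the folder object, so deleting files inside it still works through the inherited Modify grant.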

    Read the article

  • Customer Reviews on Company Listings [closed]

    - by GSTAR
    I'm not sure if this is the right place, but I am after some general advice on a feature I am looking to implement on my website. The website is a wedding directory where companies can advertise in the form of a directory listing. The listing contains the company details, such as what they do and how you can contact them. There are three packages available: a Basic package, which is free, and Silver and Gold packages, which are charged for. Now, in order to further enhance the directory, I want to enable customer reviews for each listing. This is where customers who have used a particular company's service can write a review (and rate it out of 5) of their experience. The dilemma I face is that if customers are paying for their listings, then surely they will expect not to have content on their listing that would tarnish their reputation (i.e. negative reviews). But at the same time I want to be impartial and help my site visitors make informed choices about which companies to book when arranging their weddings. Of course I will exercise my ability to remove offensive or fake reviews, but I do not want to be removing reviews just to satisfy my clients. Suppose a client pays me a premium fee to have their listing on my front page and at the top of the listings. Now suppose that client gets lots of negative reviews; the fact that they have increased visibility means that they will also get increased bad publicity. This in turn means they won't renew their package and will most likely request that the listing be taken down. So, how can I get the right balance here and keep everyone happy?

    Read the article

  • Power outage, Server 2K3 remains on "applying computer settings"

    - by syuroff
    My reward for clicking the "test" button in the APC UPS software was that it completely cut the power to my SQL server. The server promptly rebooted, the SQL service is running (verified by the app on another server that queries it), but the GUI has remained on "applying computer settings" for 20 minutes and counting, and it forbids RDC connections. Since SQL is up, it is fulfilling its key role, but it's obviously not right. What step to take next? Wait longer? Hardware is a Dell Poweredge 2850, internal RAID10.

    Read the article

  • Excel Countif external date

    - by Duall
    I am making an Excel 2010 spreadsheet to log support calls, services, and installations, which each team member would fill out. Because we are paid by job rather than by hour, I need to count how many of each type ("Call", "Service", "Install") there are in any given time span. The data entry itself is in Sheet 1, and a 'splash screen' of sorts is in Sheet 2, where I would like to enter a date range and have it display how many of each there are. I can already do the COUNTIF statement, =COUNTIF(Activity!$B:$B,"Call"), but I don't know how to:
    a) add a date condition so it only counts "Calls" within a certain time frame;
    b) take the date it looks for from a cell on the splash screen.
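    Assuming the dates live in column A of the Activity sheet and the splash screen keeps the start and end dates in, say, B1 and B2 (those cell locations are just an assumption for this example), Excel 2010's COUNTIFS function can handle both conditions at once:

        =COUNTIFS(Activity!$B:$B,"Call",Activity!$A:$A,">="&$B$1,Activity!$A:$A,"<="&$B$2)

    The &-concatenation is what lets the criteria pick the dates up from the splash-screen cells rather than hard-coding them into the formula.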

    Read the article

  • How can I get Google to re-point its search entries to new domain?

    - by poolski
    My main .com domain registration lapsed, and when I went to re-register it I found that a domain reseller service had squatted on it, so I've lost access to it. As I wasn't terribly keen on spending money funding scammers and the like, I registered a .co.uk domain under the same name. Is there any way of getting Google to re-point all its indexed links to the new domain? It's been indexing my blog for a couple of years now, and while it's not too big a deal, I'd like not to have to start all over again. Also, searching for my site returns an old entry which currently points at an "Apply for a Tax Break NOW!!!" page.

    Read the article

  • Most secure way to access my home Linux server while I am on the road? Specialized solution wanted

    - by Ace Paus
    I think many people may be in my situation. I travel on business with a laptop, and I need secure access to files from the office (which in my case is my home).

    The short version of my question: how can I make SSH/SFTP really secure when only one person needs to connect to the server from one laptop? In this situation, what special steps would make it almost impossible for anyone else to get online access to the server?

    A lot more details: I use Ubuntu Linux on both my laptop (KDE) and my home/office server. Connectivity is not a problem; I can tether to my phone's connection if needed. I need access to a large number of files (around 300 GB). I don't need all of them at once, but I don't know in advance which files I might need. These files contain confidential client info and personal info such as credit card numbers, so they must be secure. Given this, I don't want to store all these files on Dropbox or Amazon AWS, or similar. I couldn't justify that cost anyway (Dropbox don't even publish prices for plans above 100 GB, and security is a concern). However, I am willing to spend some money on a proper solution. A VPN service, for example, might be part of the solution? Or other commercial services? I've heard about PogoPlug, but I don't know if there is a similar service that might address my security concerns.

    I could copy all my files to my laptop because it has the space, but then I have to sync between my home computer and my laptop, and I found in the past that I'm not very good about doing this. And if my laptop is lost or stolen, my data would be on it. The laptop drive is an SSD, and encryption solutions for SSD drives are not good. Therefore, it seems best to keep all my data on my Linux file server (which is safe at home). Is that a reasonable conclusion, or is anything connected to the Internet such a risk that I should just copy the data to the laptop (and maybe replace the SSD with an HDD, which reduces battery life and performance)? I view the risk of losing a laptop as higher. I am not an obvious hacking target online. My home broadband is cable Internet, and it seems very reliable.

    So I want to know the best (reasonable) way to securely access my data from my laptop while on the road. I only need to access it from this one computer, although I may connect via my phone's 3G/4G, WiFi, some client's broadband, etc., so I won't know in advance which IP address I'll have. I am leaning toward a solution based on SSH and SFTP (or similar). SSH/SFTP would provide about all the functionality I anticipate needing. I would like to use SFTP and Dolphin to browse and download files; I'll use SSH and the terminal for anything else.

    My Linux file server is set up with OpenSSH. I think I have SSH relatively secured, and I'm using DenyHosts too. But I want to go several steps further. I want to get the chances that anyone can get into my server as close to zero as possible while still allowing me to get access from the road. I'm not a sysadmin or programmer or real "superuser"; I have to spend most of my time doing other things. I've heard about "port knocking" but I have never used it and I don't know how to implement it (although I'm willing to learn). I have already read a number of articles with titles such as:
    - Top 20 OpenSSH Server Best Security Practices
    - 20 Linux Server Hardening Security Tips
    - Debian Linux Stop SSH User Hacking / Cracking Attacks with DenyHosts Software
    - more...
    I have not implemented every single thing I've read about. I probably can't do that. But maybe there is something even better I can do in my situation, because I only need access from a single laptop. I'm just one user, and my server does not need to be accessible to the general public. Given all these facts, I'm hoping I can get some suggestions here that are within my capability to implement and that leverage these facts to create far better security than the general-purpose suggestions in the articles above.
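    For the single-user case, most of the hardening those articles recommend boils down to key-only authentication plus a handful of sshd_config directives. The snippet below is a hedged sketch (the username and port are placeholders), not a complete policy:

        # /etc/ssh/sshd_config (excerpt)
        Port 2222                      # non-default port cuts log noise; not real security by itself
        Protocol 2
        PermitRootLogin no
        PasswordAuthentication no      # keys only; generate with ssh-keygen, copy with ssh-copy-id
        PubkeyAuthentication yes
        AllowUsers acepaus             # only this one account may log in at all
        MaxAuthTries 3
        LoginGraceTime 30

    DenyHosts (already in place) or fail2ban, and optionally port knocking or a VPN in front of SSH, can then be layered on top of this.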

    Read the article

  • Windows 7: How to enable firewall disabled by global policy on a computer joined to a domain?

    - by kzen
    On a Windows 7 Enterprise 64-bit laptop joined to a corporate domain, the Windows Firewall is disabled by a global policy. Is there any way to enable the Windows Firewall in this scenario? The gpedit.msc setting "Windows Firewall: Protect all network connections" is inaccessible.
    EDIT: It appears that changing the HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\gpsvc\Start value to 4 will disable the Group Policy client service and allow you to start the firewall and stop the bots from pushing cr*p to your computer... I will check on Monday, and if it works I'll confirm here in case someone else in my situation stumbles upon this question...
    EDIT: It's probably better if I write a mock Windows service that does nothing, name it according to what is expected to be on my box, then create a mock McCrappy executable and mock McCrappy folder structure and remove all the actual stuff... That would take a little time but would most certainly make my box completely stealthy...

    Read the article

  • How SmartDNS Works

    - by Emad
    If you travel outside the US, you'll notice that most streaming services like Netflix, Pandora, Hulu, etc. are blocked, usually by the service providers themselves. To get around that, people use VPN services, which basically tunnel your traffic through a US server so your requests seem to originate in the US. These VPN services fix the blocking problem, but they make your connection slower than a normal, un-VPNed connection. Recently, however, I've come across something called SmartDNS, provided by overplay.net. You pay $5 a month and you get access to their DNS servers. After you change to their DNS you get access to the blocked streaming sites, without slowing down your normal traffic like email and browsing. What I'd like to know is the technical details of how this SmartDNS works. I've done some quick research but that didn't turn up anything of substance. Does anybody out there know?
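    A rough way to see the mechanism for yourself is to compare what your normal resolver and the SmartDNS resolver return for a streaming hostname: the gist is that the SmartDNS server hands back the address of its own US-based relay for a short list of geo-checked hostnames and answers everything else normally, so only that geo-check traffic is proxied while the rest of your traffic goes direct. A hedged Python sketch using the dnspython package (the hostname and resolver address below are placeholders):

        import dns.resolver   # pip install "dnspython>=2.0"

        def lookup(hostname, nameserver=None):
            resolver = dns.resolver.Resolver(configure=(nameserver is None))
            if nameserver:
                resolver.nameservers = [nameserver]   # e.g. the SmartDNS resolver's IP
            return [r.address for r in resolver.resolve(hostname, "A")]

        host = "www.example-streaming-site.com"              # placeholder hostname
        print("system resolver :", lookup(host))
        print("smart DNS       :", lookup(host, "203.0.113.53"))   # placeholder SmartDNS IP

    If the two answers differ for a blocked site but match for an ordinary one, you are seeing the selective-answer trick in action.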

    Read the article

  • Windows 2012 remote access can't connect

    - by Gelo Volro
    I installed Windows Server 2012 about 9 months ago. Earlier I could connect to my server via its external IP perfectly, but the license for Remote Desktop Services has now expired. At first I thought that was the reason I could no longer use my server as an RDP server for users to connect to. But then I read on the web that it's possible to fall back on the native Windows feature by simply removing the trial RDS role; the server will still work as an RDP server, but with some limitations (e.g. such a free RDP setup may accept only one connection, and another connection will disconnect the first, etc.). Is it true that this is possible? If yes, please give me some advice on how I should set it up, because I don't want to use TightVNC or other tools. Thanks!

    Read the article

  • SQL Server Configuration Scripting Utility Release 9

    - by Bill Graziano
    There's another update to my little utility to script a SQL Server's configuration. I use this for two purposes. First, I use it to keep my database mirroring servers up to date. Second, I capture the output in a version control system and keep that for historical reference. In release 3.0.9 I made the following changes:
    - Rewrote the encrypted trigger scripting. It will now list the encrypted triggers in a comment in the table script but can't actually script them.
    - It now scripts any server event notifications.
    - You can script a single database using the /scriptdb flag. Please note that it will also script the instance and system databases when it does this.
    - It will script any user-defined endpoints. This will capture your mirroring endpoints and, more importantly, any service broker endpoints.
    - It will gracefully skip database mail on the Express Edition.
    It still doesn't support SQL Server 2012. I think that's the next feature to add though.

    Read the article

  • Efficient algorithm for Virtual Machine (VM) Consolidation in the Cloud

    - by devansh dalal
    PROBLEM: We have N physical machines (PMs), each with RAM Ri and CPU Ci, and a set of currently scheduled VMs, each with RAM requirement ri and CPU requirement ci respectively. Moving (migrating) any VM from one PM to another has an associated cost which depends on its RAM ri. A PM with no VMs is shut down to save power. Our target is to minimize the weighted sum of N and the migration cost by migrating some VMs, i.e. minimize the number of working PMs while not degrading the service level through excessive migrations.
    My approach: the brute-force approach is to choose the least-loaded PM and try to fit its VMs onto other PMs with a First Fit Decreasing algorithm, or to select victim and target PMs based on their load level and shut down victims where possible by moving their VMs to targets. I tried this greedy approach on data from Baadal (the IIT-D cloud) but it isn't giving promising results. I have also tried to study ant colony optimization for dynamic VM consolidation but was unable to understand very much of it. I used these links:
    http://dumas.ccsd.cnrs.fr/docs/00/72/52/15/PDF/Esnault.pdf
    http://hal.archives-ouvertes.fr/docs/00/72/38/56/PDF/RR-8032.pdf
    Would anyone please clarify the solution or suggest a new approach or resources for better performance? I am basically searching for the algorithms, not the physical optimizations, and I know that many commercial organizations provide these solutions, but I just want to understand the underlying algorithms. Thanks in advance.
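    For reference, here is a minimal Python sketch of the First Fit Decreasing step mentioned above: take the VMs of a candidate (least-loaded) PM, sort them by size, and try to pack each one onto the first other PM with enough spare RAM and CPU. The data structures are invented for the example; a real consolidator would also weigh each move's migration cost against the power saved by emptying the PM.

        def try_to_evacuate(victim_vms, other_pms):
            """victim_vms: list of (ram, cpu) requirements on the PM we want to empty.
            other_pms: list of dicts with 'ram_free' and 'cpu_free' capacities.
            Returns the planned moves, or None if the PM cannot be fully emptied."""
            moves = []
            # First Fit Decreasing: place the biggest VMs first.
            for ram, cpu in sorted(victim_vms, reverse=True):
                placed = False
                for idx, pm in enumerate(other_pms):
                    if pm["ram_free"] >= ram and pm["cpu_free"] >= cpu:
                        pm["ram_free"] -= ram
                        pm["cpu_free"] -= cpu
                        moves.append({"vm": (ram, cpu), "target_pm": idx})
                        placed = True
                        break
                if not placed:
                    return None   # some VM did not fit anywhere; keep the victim PM running
            return moves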

    Read the article

  • join ZFS/Solaris to windows AD 2003/2008 domain

    - by user95587
    I have a client trying to join his newly updated ZFS/Solaris box to my Windows AD 2003/2008 domain. Here is the command he is using and the error he is getting.
    Console:
        root@xxx:/etc/inet# smbadm join -u USER DOMAIN
        After joining DOMAIN the smb service will be restarted automatically. Would you like to continue? [no]: yes
        Enter domain password:
        Joining DOMAIN ... this may take a minute ...
        failed to join DOMAIN: UNSUCCESSFUL
        Please refer to the system log for more information.
    From /var/adm/messages:
        Sep 22 10:12:00 xxx smbd[593]: [ID 702911 daemon.error] smbrdr_exchange[116]: failed (-3)
        Sep 22 10:12:01 xxx smbd[593]: [ID 232655 daemon.notice] ldap_modify: Insufficient access
        Sep 22 10:12:01 xxx smbd[593]: [ID 898201 daemon.notice] Unable to set the TRUSTED_FOR_DELEGATION userAccountControl flag on the machine account in Active Directory. Please refer to the Troubleshooting guide for more information.
        Sep 22 10:12:01 xxx smbd[593]: [ID 526780 daemon.notice] Failed to establish NETLOGON credential chain
        Sep 22 10:12:01 xxx smbd[593]: [ID 871254 daemon.error] smbd: failed joining DOMAIN (UNSUCCESSFUL)

    Read the article

  • Terminal Server 2003 Login Issue - Insufficient system resources exist to complete the requested service

    - by LP
    Afternoon. We have three identical terminal servers running Windows Server 2003 SP2; on these servers there are about 250 concurrent users logged on. We're running roaming profiles from a central server running Active Directory, and the profiles are cached locally on each terminal server as well. When one user - and just that one user - tries to log in, she gets this error message (roughly translated from Swedish): "You could not be logged in because your principal could not be registered. Check that you're connected to the network or ask your administrator. Insufficient system resources exist to complete the requested service." Anyone have an idea about this? I'm stumped ...
    Best Regards
    LP

    Read the article

  • Does Google Tv (NSZ-GS7) work with HTPC?

    - by Mark Trinh
    I was wondering if my Google TV will work seamlessly with my HTPC. I'm running Windows Media Center with an InfiniTV quad tuner and a CableCARD. I'm able to connect my GTV to my HTPC and see the GTV interface overlaid on top of Media Center. Unfortunately, I'm not really able to use the TV/Movies app to look at live shows and change the channel. If I have the video device set up as "Media Center PC", the Live TV app doesn't work; if I change it to my TV service provider (Verizon FiOS), the Live TV app works, but then it isn't able to change the channel.

    Read the article

  • vsftpd allow anonymous log-in

    - by user1817081
    I'm setting up an FTP server that will allow anonymous users to READ/WRITE to the server. Here is my configuration:
        anonymous_enable=YES
        local_enable=YES
        write_enable=YES
        anon_upload_enable=YES
        anon_mkdir_write_enable=YES
        xferlog_enable=YES
        connect_from_port_20=YES
        xferlog_file=/var/log/xferlog
        xferlog_std_format=YES
        ftpd_banner=Welcome to blah FTP service.
        listen=YES
        pam_service_name=vsftpd
        userlist_enable=NO
        tcp_wrappers=YES
        no_anon_password=YES
    On /var/ftp/ I set the permissions to 755. When I tried to set them to 777, I got the following error when I tried to log in:
        500 OOPS: vsftpd: refusing to run with writeable anonymous root
        Login failed.
    Do I need to set up anything else to allow READ/WRITE for anonymous?
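    vsftpd refuses a writable anonymous root by design, so the usual pattern is to keep /var/ftp itself at 755 and give anonymous users a writable subdirectory owned by the ftp user instead. A hedged sketch (the subdirectory name is just an example):

        # Keep the anonymous root itself non-writable...
        chmod 755 /var/ftp
        # ...and give anonymous users a writable subdirectory owned by the ftp user
        mkdir -p /var/ftp/incoming
        chown ftp:ftp /var/ftp/incoming
        chmod 755 /var/ftp/incoming

    With anon_upload_enable and anon_mkdir_write_enable already set as above, anonymous logins can then read everywhere under /var/ftp and write inside the subdirectory.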

    Read the article

  • How do I debug an upstart job?

    - by Cerales
    I have the following job in /etc/init/collector:
        start on runlevel [2345]
        stop on runlevel [!2345]
        expect daemon
        exec /usr/bin/twistd -y /path/to/my/tac/file
    When I start the job with sudo service collector start, it hangs. If I ctrl-c and run initctl list, I see this:
        collector start/killed, process 616
    I can't see an instance of the twistd daemon in ps, and the HTTP server it's supposed to be providing does not exist. I even tried this without 'expect daemon' and with a simple call to a one-line bash script using a script stanza, and it still doesn't work. I think I'm doing something very wrong. What could it be?
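    Two things that often help with a job like this, offered as a hedged sketch rather than a verified fix: let Upstart capture the job's output (the console log stanza writes it to /var/log/upstart/collector.log on Upstart 1.4 and later), and sidestep the fragile fork-counting of expect daemon by running twistd in the foreground with its --nodaemon flag:

        # /etc/init/collector
        start on runlevel [2345]
        stop on runlevel [!2345]
        console log                       # stdout/stderr land in /var/log/upstart/collector.log
        # no "expect" stanza needed when the process stays in the foreground
        exec /usr/bin/twistd -n -y /path/to/my/tac/file

    The "start/killed" state usually means Upstart was tracking the wrong PID after the daemonization forks, which the foreground approach avoids entirely.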

    Read the article

  • Can't Add Columns to an AD Taskpad Except for the Top Level of the Domain

    - by Darktux
    We are working on an Active Directory taskpad application for user management in our organization and are facing a strange issue. When we create a taskpad and are at the top level of the domain, I can click View - Add/Remove Columns and add "Pre Windows 2000 Name" (and lots of other properties) to the taskpad as columns. But when I go just one level down, I can only see "Operating System" and "Service Pack". Why is this happening? Isn't "Domain Admins" supposed to have god access to all the things in the AD domain, at least for objects they own? It is important to have "Pre Windows 2000 Name" as a column because without it our "Shell Command" task won't show up in the taskpad, since it's bound to parameter "Col<9" (which is the pre-Windows 2000 name). Please do let me know if you have any additional questions to clarify my problem.

    Read the article
