Search Results

Search found 22633 results on 906 pages for 'service accounts'.


  • How can I forward certain emails based on header information with Postfix?

    - by Jason Novinger
    We receive service requests via a particular email address. Each request is then forwarded to other addresses, using an entry in virtual_alias_maps. When the word "EMERGENCY" appears in the subject line of a request to this address, I would also like to forward the message to another address (an alias for our administrators' SMS email addresses). I think I can accomplish this with header checks and the REDIRECT command. However, REDIRECT sends the message only to the redirected address, not to the forwarded addresses. In the "EMERGENCY" case I would like it to go to both the redirect address and the original forwarded addresses. I am fairly new to Postfix and I feel like I am missing something here. Any suggestions?
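
    One way to get both behaviors (a sketch, untested, with made-up addresses; it assumes the pcre map type is available) is to point the REDIRECT at a dedicated alias and let that alias fan out to both the SMS address and the original forward list:

        # /etc/postfix/main.cf
        header_checks = pcre:/etc/postfix/header_checks

        # /etc/postfix/header_checks
        /^Subject:.*EMERGENCY/    REDIRECT emergency@example.com

        # virtual_alias_maps entry: the redirect target expands to everyone
        emergency@example.com    admin-sms@example.com, requests-team@example.com

    Since REDIRECT replaces the original recipients, making its target an alias that includes them restores the normal forwarding alongside the SMS copy.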


  • Installing PacketFence

    - by BrNathan
    I asked in the Ubuntu area, but thought I would get a better answer here. Has anyone been able to install PacketFence on Ubuntu 10? I tried a tutorial, but didn't have any luck. Some of the services installed and are working: Apache with PHP, snort, pfdetect, and pfdhcplistener. I can even get info with pfcmd node view all, but for the life of me I can't get it to work with apache2. When I run pfcmd service pf start I also get an error: uninitialized value $_[7] in join at /usr/local/pf/lib/pf/class.pm line 170.


  • Services offering virtual machines

    - by RredCat
    I am looking for a way to set up a dev machine in the cloud, so I have access for development from different places. I don't want to fuss with syncing branches and so on. It would be Lubuntu with 1 GB or 512 MB of memory, and I want a way to set up my own image. What I have found so far: Amazon EC2, Azure Virtual Machines. I am pretty sure there are more services like this, and I hope to find one aimed at exactly this purpose. I have experience working this way and I like it; unfortunately that was a certain company's server for that company's projects, and I can't use it for my private aims. Could anybody suggest anything?


  • Is it possible for a router to "go bad" over time?

    - by JQAn
    I've been having problems with my internet connection over the past weeks (intermittent disconnections, slow transfers, etc.), and my provider keeps telling me that the problem is not on their end. I have a cable modem with a wifi router (the router was not provided by them). The router is quite old (a D-Link DIR-300), so I'm starting to wonder if it could be the issue and whether I should replace it. Is it possible that it is the cause? Can routers become so outdated that they cause intermittent interruptions of service? If I reset the modem and the router, they work fine for a few hours, but the problems start again after a while.


  • Strange memory usage pattern on Windows Server 2008 on login through Remote Desktop

    - by headsling
    I'm running Windows Server 2008 Datacenter Service Pack 2 on a VMware instance with 10 GB of RAM allocated. I'm not running IIS or SQL Server. Under 'normal' conditions, the machine uses ~5.5 GB of memory. However, when I log in to the server through Remote Desktop, the memory usage slowly climbs up to 9.8 GB in use. After several minutes the memory slowly creeps back down to the 5.5 GB mark. I've tried killing all the processes associated with my login (barring Task Manager) as soon as I log in, without success, and I can't see any single process growing while the overall usage is increasing. I'm assuming this is some system-level cache that is growing and shrinking... but why is it doing this?


  • WAMP: Apache refusing connections outside the network

    - by JoeWolf
    I have WAMP installed. I ran the server, and everything works fine from localhost and my local IP address. I forwarded port 80 on my router, but whenever I try to access the server from the outside using my real IP, it times out. I thought port forwarding was not working, so I forwarded another port for a different service and it went through, so the problem is with Apache. I checked the error log and didn't find any errors. Skype is off. Any tips on what could be causing this? Thanks!
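
    For anyone hitting the same wall: WAMP ships with Apache configured to answer only local requests, so outside connections time out even with the port forwarded. A minimal sketch of the usual change (Apache 2.2 syntax; the path is the WAMP default and may differ on your install):

        # httpd.conf: listen on all interfaces, and allow non-local clients
        Listen 80

        <Directory "c:/wamp/www/">
            Order Allow,Deny
            Allow from all
        </Directory>

    After editing, restart Apache; WAMP's "Put Online" tray-menu item flips the same restriction.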


  • Disable ssh for bnroot as root account

    - by user2916639
    I am a beginner with CentOS Linux, and I have a dedicated server. My root username is bnroot, and right now I take ssh sessions with this user. I want to disable ssh for bnroot: I have created a user named welcome, and I want to log in over ssh as welcome and then use su - bnroot to get root privileges. I have set PermitRootLogin no and AllowUsers welcome in /etc/ssh/sshd_config and restarted the sshd service. Logging in over ssh as welcome works, but when I then run su bnroot it prompts for a password, and even when I enter the right password it shows: su: incorrect password. I don't know where I am wrong; please help me here. Changes I made:

        /etc/ssh/sshd_config
            PermitRootLogin no
            AllowUsers welcome

        /etc/sudoers
            welcome ALL=(ALL) ALL

    Errors in /var/log/secure:

        unix_chkpwd[666]: password check failed for user (bnroot)
        su: pam_unix(su:auth): authentication failure; logname=ewalletssh uid=503 euid=500 tty=pts/1 ruser=ewalletssh rhost= user=bnroot

    Please let me know where I am wrong.
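
    One common cause worth checking here (an assumption, not a certain diagnosis): on CentOS, /etc/pam.d/su may contain a pam_wheel line, and when it is active su reports "incorrect password" for callers outside the wheel group even when the password is right. A sketch of the check and fix:

        # /etc/pam.d/su -- if this line is present and uncommented...
        #   auth  required  pam_wheel.so use_uid
        # ...then the calling user must be in the wheel group:
        usermod -aG wheel welcome

    If no pam_wheel line is active, the unix_chkpwd failure points at a plain authentication problem for bnroot, so re-testing that password at a local console is the next step.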


  • How to extend a file definition from an existing module in the node?

    - by c33s
    I use an older version of the example42 mysql module, which defines the mysql.conf file but not its content. My goal is to just include the mysql module and add a content definition in the node.

        class mysql {
            ...
            file { "mysql.conf":
                path    => "${mysql::params::configfile}",
                mode    => "${mysql::params::configfile_mode}",
                owner   => "${mysql::params::configfile_owner}",
                group   => "${mysql::params::configfile_group}",
                ensure  => present,
                require => Package["mysql"],
                notify  => Service["mysql"],
            }
            ...
        }

        node xyz {
            include mysql
            File["mysql.conf"] { content => template("mymodule/mysql.conf.erb") }
        }

    The above code produces "Only subclasses can override parameters". What is the correct way to just add a content definition to an existing file definition?
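
    The error message contains the answer: a resource parameter can only be overridden from a subclass of the class that declared it. A minimal sketch, reusing the names from the question (the wrapper class name mymodule::mysql is arbitrary):

        class mymodule::mysql inherits mysql {
            File["mysql.conf"] {
                content => template("mymodule/mysql.conf.erb"),
            }
        }

        node xyz {
            include mymodule::mysql
        }

    A resource collector override (File <| title == "mysql.conf" |>) would also work from node scope, but collectors override wherever they match, so the explicit subclass is usually easier to reason about.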


  • Open-sourcing a proprietary library without certain features

    - by nha
    I hope I'm in the right place to ask this. I have a question regarding the practice of open-sourcing a proprietary library that we built and use at work. The licence will probably be MIT. I like the idea, but here comes the unusual part: I have been tasked with removing some of the most advanced features. Those will remain on our servers, available as a service. We will open-source the library (JavaScript, in case it is of interest), along with minimal associated server code. I am not asking about the technical problems (I imagine we will have to maintain and synchronize somewhat different repositories, maybe with incompatible pull requests, but that is for Stack Overflow). What I would like to know is: How would that be perceived by the community at large? Does it risk killing any eventual interest in this library? I don't personally know of any library that works like this. I'm pretty sure it is possible, however, and any evidence of such a library is welcome (successful if possible), partly because I'd like to see how they present it. More importantly, what could be the rationale for or against it? I'm not sure I understand the consequences of doing it this way.


  • How do I synchronize two folders in Windows 7 in real time?

    - by acme
    I want Windows 7 to synchronize two folders in real time (maybe by running a service that monitors a folder). Basically I want to monitor a folder and synchronize each change (new files, changed files, deleted files) to another drive. It has to be in real time, so changes get synchronized instantly as they happen. One-direction synchronization is enough. I tried Microsoft's SyncToy, but it only syncs manually or on a schedule. Can this be achieved with Windows 7 itself, or does anyone know a freeware application for this?
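
    One built-in, near-real-time option: robocopy (included with Windows 7) has a monitoring mode. A sketch with made-up paths; note that /MOT's granularity is one minute, so "instant" is approximate:

        :: mirror C:\watched to D:\mirror, re-running after at least 1 change,
        :: at most once per minute
        robocopy C:\watched D:\mirror /MIR /MON:1 /MOT:1

    Caution: /MIR also deletes destination files that vanish from the source, which matches the one-direction sync described above, but it means the destination is a mirror, not an archive.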


  • TCP and fair bandwidth sharing

    - by lxgr
    The congestion control algorithms of TCP seem to distribute the available bandwidth fairly between individual TCP flows. Is there some way to enable (or more precisely, enforce) fair bandwidth sharing on a per-host instead of a per-flow basis on a router? There should not be an (easy) way for a user to gain a disproportionate bandwidth share by using multiple concurrent TCP flows (the way some download managers and most P2P clients do). I'm currently running a DD-WRT router to share a residential DSL line, and at the moment it's possible to (inadvertently or maliciously) hog most of the bandwidth by using multiple concurrent connections, which badly affects VoIP conversations. I've played with the QoS settings a bit, but I'm not sure how to enable fair bandwidth sharing on a per-IP basis (per-service is not an option, as most of the flows are HTTP).
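
    One approach on Linux-based firmware like DD-WRT (a sketch, not a tested recipe: it assumes the build ships an SFQ variant that can hash on source address, such as ESFQ, and the interface name and rate are placeholders):

        # shape the upload to just under the DSL uplink rate, then let the
        # fairness qdisc hash queues by internal source IP instead of per-flow
        tc qdisc add dev ppp0 root handle 1: htb default 10
        tc class add dev ppp0 parent 1: classid 1:10 htb rate 800kbit ceil 800kbit
        tc qdisc add dev ppp0 parent 1:10 handle 10: esfq perturb 10 hash src

    With per-source hashing, each internal host gets one queue no matter how many TCP flows it opens, which is the per-host fairness stock SFQ lacks. The download direction needs the same idea applied on the LAN-facing interface.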


  • How do I check if a program can potentially be a virus?

    - by acidzombie24
    I am running Windows XP in a VM. I want to download a few applications, install them one by one, and check whether each could potentially be a virus. I assume a virus would need to add something to the Startup folder, add an entry to the startup section of the registry, or add a service. What else might it do to become active? Anyway, how can I check whether a program may be a virus? I use HijackThis to get a list of processes, and I simply compare the list from before I installed to after and see if anything is different. Is this good enough? My main OS is Windows 7, but I don't have that in a VM and don't see a reason to test with it.
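
    To make the before/after comparison more systematic than eyeballing a process list, the usual auto-start locations can be snapshotted with XP's built-in tools. A partial sketch (these are the most common persistence points, not all of them):

        rem dump common auto-start points before the install...
        reg query HKLM\Software\Microsoft\Windows\CurrentVersion\Run > before-hklm.txt
        reg query HKCU\Software\Microsoft\Windows\CurrentVersion\Run > before-hkcu.txt
        sc query state= all > before-services.txt
        rem ...install the program, repeat into after-*.txt, then compare:
        fc before-hklm.txt after-hklm.txt

    This still only catches persistence; Sysinternals Autoruns covers far more locations (services, drivers, shell hooks, scheduled tasks) and can compare snapshots for you.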


  • Nginx works on my Linux machine but is not accessible from other computers in my local network

    - by crooveck
    On my LAN I have a server running Scientific Linux (a RedHat/Fedora-based distro). I did yum install nginx, but the welcome page is not accessible from other computers on my network. When I telnet to localhost on port 80 and send GET / HTTP/1.0, I get some HTML back from nginx, so it's definitely running. But when I try to connect remotely with telnet 192.168.3.130 80, I get:

        Trying 192.168.3.130...
        telnet: Unable to connect to remote host: No route to host

    So I assume something is wrong with my network settings, maybe iptables or something else. Next I turned off iptables (service iptables stop), and that helped: now I can connect remotely using telnet. So I think I need to fix my iptables rules. I did some googling and found this rule:

        -A INPUT -m state --state NEW -m tcp -p tcp --dport 80 -j ACCEPT

    but it still didn't allow me to connect remotely when iptables is up. Can someone please help me set up a proper iptables configuration?
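
    The usual catch with that googled rule: appended with -A on a stock RHEL-family config, it lands after the final catch-all REJECT line and never matches. Inserting it above the REJECT fixes it (a sketch; the position 1 may need adjusting so the rule sits before your REJECT, which iptables -L INPUT -n --line-numbers will show):

        iptables -I INPUT 1 -m state --state NEW -p tcp --dport 80 -j ACCEPT
        service iptables save    # persist the rule across reboots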


  • Uninstalled C++ Redistributables on Windows Vista

    - by VusP
    Windows Vista Service Pack 2, 32-bit. I uninstalled the Visual C++ redistributables (I was not aware they are necessary for some applications to run), and now I get the error "Cannot start application because the side-by-side configuration is not correct" when I start applications like Avast, Revo Uninstaller, etc. I came across this page, but the downloaded vcredist_x86.exe doesn't seem to do much: it extracts itself and that's it, nothing after that, no error, nothing. I just finished installing 300+ MB of updates on my system, so I don't want to use the System Restore option (and I don't know which restore point to revert to). Is there another way to get this done?


  • Does (should?) changing the URI scheme name change the semantics?

    - by Doug
    If we take http://example.com/foo, is it fair to say that ftp://example.com/foo points to the same resource, just using a different mechanism for resolving it (and of course possibly a different representation, but perhaps not)? This came to light in a discussion we were having around some internal tooling with Git. We have to process some Git repositories, and they come to us as "git@{authority}/{path}", but the library we're using to interface with them doesn't support the git protocol. I suggested that we should make the service robust in that it tries to use HTTP or SSH, in essence discovering which protocols/schemes are supported for resolving the repository at {path} under each {authority}. This was met with some criticism: "We don't know if that's the same repository." My response was: "It had better be!" Looking at RFC 3986, I see this excerpt:

        URI "resolution" is the process of determining an access mechanism and the appropriate parameters necessary to dereference a URI; this resolution may require several iterations. To use that access mechanism to perform an action on the URI's resource is to "dereference" the URI.

    Which makes me think that the resolution process is permitted to try different protocols, because:

        Although many URI schemes are named after protocols, this does not imply that use of these URIs will result in access to the resource via the named protocol.

    The only concern I have, I guess, is that I only see reference to the notion of changing protocols when it comes to traversing relationships:

        it is possible for a single set of hypertext documents to be simultaneously accessible and traversable via each of the "file", "http", and "ftp" schemes if the documents refer to each other with relative references.

    I'm inclined to think I'm wrong in my initial belief, because the Normalization and Comparison section of the RFC doesn't mention any way of treating two URIs as equivalent if they use different schemes. It seems like schemes named after (or based on) IP protocols ought to have this notion, at least?


  • Trying to upgrade memory

    - by user214876
    I've been using Ubuntu on my laptop for a while now, though I'm not quite used to it yet. I've got an Acer Aspire with the original 4 GB of memory and a 500 GB HDD, presently running 12.04, 32-bit. I have the 13.04 upgrade disc and want to upgrade my memory to 8 GB. Every time I install the 8 GB of memory, the system won't boot into either version. I downloaded the 64-bit builds of both versions of Ubuntu, but no results yet. Can anyone offer a suggestion here? I'm kind of lost.

    Additional information: The memory was purchased through Acer/Kingston and is recommended for this computer. I watched the video on installing it, so I doubt it's installed wrong (there's only one way of putting it in). I swapped operating systems, from Ubuntu 12.04 to 13.04 to 13.10, and now to Xubuntu 13.10 64-bit, and I'm still not having any luck with this upgrade. Would it be necessary to upgrade the CPU? It's just a thought; I don't know what else could keep me from using the new memory.

    Additional information: I called Kingston this afternoon; they are sending replacement lower-density memory modules (2 x 4 GB, 8 GB total). Tech support says I need to upgrade the BIOS via DOS to use the new memory, since it is no longer a Windows system. I'm not sure how to go about that, but it's a learning process I can live with. Thank you all for your help and support. I realize this isn't strictly an Ubuntu problem, but new users of this OS seem to share similar problems, and maybe someone can use this info to their advantage.


  • All MySQL Databases lost overnight

    - by Iain
    After a call from a customer to say that his website was down, I found that MySQL on our Rackspace Cloud Windows 2008 server was not running. I restarted MySQL but got the 'Access denied for user' error in the browser for all websites with a MySQL database. When I look in MySQL Server 5.5/data there are no folders other than mysql and performance_schema; it appears all the databases and data have been wiped. Does anyone know what might have happened and where the data has gone? To top it off, I just found that this server is missing from our backup service. PS: this appears to have happened after a Windows update at 4:01 this morning.


  • Why doesn't my cron.d per minute job run?

    - by Travis Griggs
    I have thrown a bunch of darts trying to get a Python script of mine to execute every minute, so I thought I'd simplify it to just do the "simplest thing that could possibly work" once per minute (I'm running Debian testing). I created a single-line file at /etc/cron.d/perminute:

        * * * * * /bin/touch /home/me/ding_dong

    It's owned by root and executable (not sure if either of those matters). Then I did:

        sudo service cron reload

    And then sat back and started running ls -ltr again and again in my home directory (/home/me). But my ding_dong file never shows up. I know that if I do a sudo /bin/touch /home/me/ding_dong, it shows up right away. Obviously I'm missing something stupid here.
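
    For anyone else who lands here: files in /etc/cron.d use the system crontab format, which has an extra user field between the schedule and the command; without it, cron skips the line. The fixed one-liner (assuming the job should run as root):

        * * * * * root /bin/touch /home/me/ding_dong

    Complaints about bad cron.d entries typically show up via syslog (grep cron /var/log/syslog on Debian), which is a quicker feedback loop than re-running ls -ltr.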


  • Ping or accessing WAN IP from LAN results in failure on only one box

    - by ComputerUserGuy
    Morning/evening, gents. I purchased a domain name today to set up a name for my services and to set up SSL. I configured the SSL fine, but when I went to my website I couldn't connect. I can connect to the site from any other device in my house, and my friend can connect to it as well from outside the LAN. I am hosting the services on my computer, and that is the one machine that can't access the service. Whenever I ping the address from the command prompt I get "General Failure." (they could have picked a less disheartening message). I'm not sure what the deal is here, as I have all of my firewalls down and my ports are forwarded. Running Windows 7. Thanks for the assistance, chaps.
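
    When the hosting machine is the only box that cannot reach its own public address, a frequent culprit is the router not hairpinning the server's connections back to itself (NAT loopback). Whatever the root cause turns out to be, a common workaround is a hosts-file entry on that machine so the domain resolves locally instead of routing out and back (the domain below is a placeholder):

        # C:\Windows\System32\drivers\etc\hosts
        127.0.0.1    example.com

    SSL keeps working as long as the name matches the certificate, since the hosts file only changes what the name resolves to, not the protocol.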


  • Is there a way to edit an existing nautilus (file manager) bookmark?

    - by C.W.Holeman II
    Is there a way to edit an existing nautilus (file manager) bookmark? Invoke nautilus from the Linux command line:

        $ nautilus

    Activate the connection editor via File > Connect To Server..., then complete the entries in the pop-up:

        Service Type:  [WebDAV (HTTP)]
        Server:        [localhost]
        Port:          [8001]
        Folder:        [webdav]
        Username:      [test]
        [x] Add bookmark
        Bookmark name: [/dav]
        <Connect>

    Then in the left column of the main window the new connection and bookmark exist:

        Places
        -------------------
        ausername
        Desktop
        File System
        Network
        WebDAV on localhost
        Trash
        --------------------
        /dav

    Right-clicking "/dav" pops up a menu:

        Open
        Open in New Tab
        Open in New Window
        ------------------
        Remove
        Rename...

    There is no option for editing.
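
    There is no edit dialog, but the bookmarks live in a plain-text file that can be edited directly. A sketch, assuming the GNOME 2-era layout this version of nautilus uses (newer releases moved the file to ~/.config/gtk-3.0/bookmarks); the exact URI shown is illustrative:

        $ cat ~/.gtk-bookmarks
        dav://test@localhost:8001/webdav /dav
        $ nano ~/.gtk-bookmarks   # edit the URI, or the label after the space
        $ nautilus -q             # quit nautilus so it re-reads the file

    Each line is a URI optionally followed by a display label, so renaming or re-pointing a bookmark is a one-line change.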


  • /dev/null became a regular file

    - by user197719
    On our production server /dev/null suddenly became a regular file; because of this the sshd service stopped and we are not able to log in to the server. We tried the steps below to turn it back into a character device file:

        rm -rf /dev/null
        mknod /dev/null c 1 3

    But as soon as we run the rm command, /dev/null is re-created as a regular file before mknod can run. We can't figure out how this is happening or which component is creating the file, so until we solve this we are unable to restore /dev/null as a character device.
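
    Two things help here: doing the removal and re-creation in a single command line narrows the race window, and the audit subsystem can name the process that keeps re-creating the file. A sketch (assumes auditd is installed and running):

        # re-create the device node with the smallest possible race window
        rm -f /dev/null && mknod /dev/null c 1 3 && chmod 666 /dev/null

        # watch the path, then see which process touches it
        auditctl -w /dev/null -p wa -k devnull-watch
        ausearch -k devnull-watch

    A classic culprit is a root cron job or script with a redirect like ">/dev/null" that runs after something else deleted the node; a redirect to a missing path quietly creates a regular file.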


  • Hardware needed for 2000 users? [closed]

    - by Trcx
    I have a school assignment that is fairly well defined, requiring us to come up with a plan for an environment serving dynamic web applications to 2000 users, which should be able to scale up to six thousand. I have done plenty of research on load balancing, redundancy, UPSs, etc., but am having a hard time figuring out how much hardware is actually needed in the way of physical servers, RAM, processing power, and so on. The assignment states that the environment will have a lot of dynamic code, and that email and a database are required, all using the appropriate Microsoft products (MS SQL, Exchange, IIS). I already plan on splitting these out onto separate servers, but I can't even fathom the hardware requirements of something that large. Could someone with experience weigh in on this, or point me to some good articles?


  • How to disable/destroy forever Chrome's "print preview" option?

    - by VeryVito
    This question seems to come up a lot (pretty much every time a new version of Chrome is released), but previous answers such as these no longer apply (or don't work on the Mac):

        How do I get the "old style" system print dialog for Chrome on Windows?
        Disable Chrome's Ctrl+P handling of printing

    Sadly, Google seems intent on shoving this broken preview screen down our throats (the thought of someone not wanting to use their nonstandard, feature-poor alternative to a system-wide service is apparently inconceivable to them), and the "disable" flag no longer exists in recent versions. Does anyone know how to disable it in modern versions of the browser, which no longer include this option under chrome://flags (on OS X specifically)?
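
    One avenue on OS X (hedged: the DisablePrintPreview policy existed for earlier Chrome versions and may have been dropped from the build you are running) is setting the policy as a preference while Chrome is not running:

        defaults write com.google.Chrome DisablePrintPreview -bool true

    If the policy is gone in a given build, the keyboard escape hatch still works: Cmd+Option+P (Ctrl+Shift+P on Windows/Linux) opens the system print dialog directly, bypassing the preview.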


  • What is and what is not replicated in a glassfish cluster with a mod_jk load balancer?

    - by Navigateur
    I have a GlassFish (3.1.2) cluster across two computers as nodes, with a mod_jk load balancer. Are servlet instance variables replicated perfectly? If not, how do I make sure they are? Are all actions, including method calls and disk writes, replicated perfectly? If not, how do I make sure they are? These may seem like stupid questions, but I'm not seeking "load balancing" so much as exact replication, to enable future upgrading without any service interruption. How do I achieve this if it is not already the case?
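
    On the first question: plain servlet instance variables are not replicated; GlassFish replicates HttpSession state only, and only for applications that declare themselves distributable. A sketch of the web.xml marker (the rest of the descriptor omitted):

        <web-app xmlns="http://java.sun.com/xml/ns/javaee" version="3.0">
            <distributable/>
        </web-app>

    Method calls and disk writes are not replicated either; each instance executes its own requests. State that must survive a failover belongs in the session (as serializable objects) or in shared storage such as a database, not in instance fields.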


  • A lot of connections to port 6881: some new attack, or what?

    - by stoleto
    OK, so I am the admin of a small network with a web server, and only the web server has a direct connection to the internet; the rest of the network connects through another route. I was inspecting the traffic on the server with tcpdump, and I found a LOT of connections from different IP addresses to port 6881. All ports on my machine are blocked except those really needed for a web server (like port 80), and I checked and confirmed that 6881 and the rest are in a filtered (firewalled) state. Why do all those IPs keep trying to connect to the server on port 6881 when it isn't open at all? Is this some new kind of attack, or maybe there's a new exploit (maybe a 0-day?) for some service running on 6881? AFAIK BitTorrent and similar clients operate on 6881, so really, what's the deal? It would be nice if someone could clarify a few things for me.

