Search Results

Search found 14544 results on 582 pages for 'ssh config'.

Page 322/582

  • Improving server security [closed]

    - by Vicenç Gascó
    I've been developing web apps for a while, and I always had a sysadmin who made the environment perfect for running my apps with no worries. But now I am starting a project on my own, and I need to set up a server knowing next to nothing about it. All I need is a Linux box with a web server (I usually used Apache), PHP and MySQL. I'll also need SSH, SSL to serve https://, and FTP to transfer files. I know how to install almost everything with Ubuntu Server (though I need advice about SSL), but I am concerned about security: the firewall, open/closed ports, PHP security, and so on. Where can I find a good guide covering these topics? Everything else on the server I don't need, and I'd like to know how to remove it to avoid wasting resources. Final note: I'll be running the web app on Amazon EC2 or Rackspace cloud servers. Thanks in advance!
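
    A minimal firewall sketch for a stack like the one described, assuming Ubuntu's ufw is available and that SSH, HTTP and HTTPS are the only services that should be reachable from outside (add FTP only if it is really needed):

        # allow remote administration and web traffic, block everything else inbound
        sudo ufw allow OpenSSH
        sudo ufw allow 80/tcp
        sudo ufw allow 443/tcp
        sudo ufw default deny incoming
        sudo ufw default allow outgoing
        sudo ufw enable
        sudo ufw status verbose

    MySQL then stays reachable only from localhost, which is usually what a single-server LAMP setup wants.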

    Read the article

  • Encryption in the Data Access Block

    - by Sathish
    I am using the Enterprise Library Data Access Application Block in my application. I want to encrypt the connection string, store it in the config file, and have the application decrypt and use it at runtime. How can I do this?
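
    One hedged approach, assuming this is a web application whose configuration lives in web.config: the .NET protected-configuration feature can encrypt the connectionStrings section in place with aspnet_regiis, and the configuration system (and therefore the Data Access Block) decrypts it transparently at runtime. The framework path and site path below are examples:

        cd %WINDIR%\Microsoft.NET\Framework\v2.0.50727
        aspnet_regiis -pef "connectionStrings" "C:\inetpub\wwwroot\MyApp"
        rem decrypt again if the section ever needs hand editing
        aspnet_regiis -pdf "connectionStrings" "C:\inetpub\wwwroot\MyApp"

    For a non-web app.config there are workarounds (temporarily renaming the file to web.config, or calling SectionInformation.ProtectSection in code), but that is a separate exercise.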

    Read the article

  • using Linux vncviewer

    - by Darkoni
    When I connect to the VNC server using Wine on Linux ($ wine vncviewer.exe), I have to enter:

        VNC Server: 1.1.1.21
        Proxy/Repeater: 195.29.18.33:1234

    and once connected, the title bar shows 1.1.1.21:5900 (195.29.18.33:1234). My question is: how do I connect using the native vncviewer? What should I put in VNC_VIA_CMD?

        $ export xlocalPort=1234
        $ export xremoteHost=1.1.1.21
        $ export xremotePort=5900
        $ export xgateway=195.29.18.33
        $ export VNC_VIA_CMD="/usr/bin/ssh -f -L $xlocalPort:$xremoteHost:$xremotePort $xgateway sleep 20"
        $ vncviewer $xremoteHost -via $xgateway

    and I get the error: unable connect to socket: Connection refused (111). I was trying to help myself with the page http://www.tightvnc.com/vncviewer.1.php. Please help, because I need to use the "native" Linux vncviewer installed by

        $ yum install tigervnc
        tigervnc.i686 0:1.0.90-0.13.20100420svn4030.fc13
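
    A hedged sketch of the conventional -via usage, assuming the gateway at 195.29.18.33 accepts SSH logins: the viewer sets the L, H, R and G variables itself and builds the tunnel from the built-in default VNC_VIA_CMD, so no manual exports should be needed (display 0 corresponds to port 5900):

        $ vncviewer -via 195.29.18.33 1.1.1.21:0

    If the gateway only exposes the repeater on port 1234 and not SSH, -via will not help and the connection has to go through the repeater protocol instead.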

    Read the article

  • How to find a server's internal IP from its external IP

    - by HWTech
    I've got 12 servers in a datacenter, but I can log in by SSH to only one of them (the facade server); the other servers are reachable only from it. The hosts file on it lists the IP of each available server:

        milkov@devel:/var/www/davel$ cat /etc/hosts
        192.168.1.4   data1
        192.168.1.7   data2
        192.168.1.5   bground1
        192.168.1.6   bground2
        192.168.1.10  frontend1
        192.168.1.11  frontend2
        ...

    I also have the domain megaplan.tvigle.ru (IP 79.142.100.36). Question: how can I tell which of these servers serves that domain? In other words, how do I find a server's internal IP address given its external IP? PS: Sorry about my English.
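
    One hedged way to narrow it down, run from the facade server: ask each internal host for the site by name and see which one answers (host names are the ones from /etc/hosts above; curl is assumed to be installed):

        for h in data1 data2 bground1 bground2 frontend1 frontend2; do
            printf '%-12s ' "$h"
            curl -s -o /dev/null -w '%{http_code}\n' -H 'Host: megaplan.tvigle.ru' "http://$h/"
        done

    The host returning 200 (or a redirect) for that Host header is the likely backend; the external IP itself is probably a NAT or load-balancer address that only the network gear knows about.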

    Read the article

  • Are periodic full backups really necessary on an incremental backup setup?

    - by user2229980
    I intend to use an old computer I have as a remote backup server for myself and a few other people. We are all geographically separated, and the plan is to do incremental daily backups using rsync and SSH. My original idea was to make one initial full backup and then never again deal with that overhead, copying only the files changed since the last backup from that moment on. I've been told this could be bad, but I fail to understand why. Since each snapshot consists of hard links to the unchanged files plus the changed ones, isn't it identical to a new full backup? Why would I want to make another full backup?
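
    A minimal sketch of the hard-link snapshot scheme described above, assuming snapshots live under /backups/client on the backup server (the source path, host name and destination are placeholders):

        #!/bin/sh
        # each run creates a new dated snapshot; unchanged files become hard links into the previous one
        SRC="user@client:/home/user/"
        DEST="/backups/client"
        TODAY=$(date +%F)
        PREV=$(ls -1d "$DEST"/????-??-?? 2>/dev/null | tail -n 1)

        rsync -a --delete -e ssh ${PREV:+--link-dest="$PREV"} "$SRC" "$DEST/$TODAY/"

    Whether periodic fresh fulls are still worthwhile usually comes down to guarding against silent corruption of the linked copies, not to how rsync itself works.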

    Read the article

  • Nomachine 4 for X forwarding

    - by Yair
    I have been using the NoMachine NX client to connect from my Mac to an Ubuntu server for a while now, and it has been a great experience. The most useful feature for me was the option to open just one application on the remote machine instead of a full remote desktop connection; I used it to open a terminal on the remote machine. Basically it was a much faster, much better replacement for ssh -X. All was great until I upgraded to the new version, NoMachine 4. In this version I cannot find that option. I have to run a full remote desktop session, which slows things down and is also much less convenient for my work. Was this option removed from the client? Or is it hiding somewhere in there and I just can't find it?

    Read the article

  • Why is scp not overwriting my destination file?

    - by Noli
    I'm trying to back up a file via the command

        scp /tmp/backup.tar.gz hostname:/home/user/backup.tar.gz

    When I run it, the scp progress bar shows up and it looks like it's transferring the file. However, when I log into the destination server to check, the file's timestamp and size haven't changed from the older version, so it looks like scp didn't overwrite the old file at all. It only seems to work when I manually delete the file from the destination server first. I'm running Ubuntu locally, and this is happening with two servers: one running Cygwin's sshd and one running Fedora Core 3. Anyone have any idea why this is happening? I thought scp would simply overwrite existing files. Thanks
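
    A hedged way to check whether the transfer is really landing where you think it is (paths as in the question; md5sum is assumed to exist on both ends):

        scp /tmp/backup.tar.gz hostname:/home/user/backup.tar.gz
        md5sum /tmp/backup.tar.gz
        ssh hostname 'md5sum /home/user/backup.tar.gz; ls -l /home/user/backup.tar.gz'

    If the checksums match, the copy is working and the surprise lies elsewhere, for example a different notion of the home directory or path on the Cygwin side.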

    Read the article

  • Multi Process Configuration

    - by user200937
    Hi, I have a product built out of multiple processes, and each process internally uses Commons Configuration. Does anyone have an idea how to manage the config? That is, we do not want to duplicate variables, yet each process should be able to read them. Additionally, a database-backed solution is no good, as we do not want to depend on a DB for something like configuration. Thanks, Yair

    Read the article

  • ASP.NET: IHttpModule + m_context.Server.Transfer = session state error

    - by tinky05
    I have an IHttpModule that implements IRequiresSessionState. Session state is set to "on" in the page directive, and I also enabled it in web.config. In the module's OnBeginRequest method I call Server.Transfer, but I get the error: "Session state can only be used when enableSessionState is set to true, either in a configuration file or in the Page directive." When I access the page directly or via a Response.Redirect, there is no error. Any idea?

    Read the article

  • Ubuntu - Automatically mount external drives to /media/LABEL on boot without a user logged in?

    - by endolith
    This question is similar, but kind of the opposite of what I want. I want external USB drives to be mounted automatically at boot, without anyone logged in, to locations like /media/<label>. I don't want to have to enter all the data into fstab, partially because it's tedious and annoying, but mostly because I can't predict what I'll be plugging into it or how the partitions will change in the future. I want the drives to be accessible to things like MPD, and available when I log in with SSH. gnome-mount seems to only mount things when you are locally logged into a Gnome graphical session.
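
    A hedged sketch of one way this was commonly handled without fstab entries: a udev rule that mounts any newly added partition under its filesystem label (the rule file name and device match are illustrative, and newer udev versions restrict what RUN commands may do, so treat it as a starting point rather than a finished solution):

        # /etc/udev/rules.d/99-automount-by-label.rules
        KERNEL=="sd[b-z][0-9]", ACTION=="add", ENV{ID_FS_LABEL}!="", RUN+="/bin/mkdir -p /media/%E{ID_FS_LABEL}", RUN+="/bin/mount /dev/%k /media/%E{ID_FS_LABEL}"

    Packages such as usbmount take a similar approach with less hand-rolling, although they mount to fixed names like /media/usb0 rather than the label.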

    Read the article

  • How do I make webmin secure?

    - by Josiah
    I want to install Webmin to make server administration tasks on my Ubuntu 10.04 server easier. However, I'm very nervous about having that kind of power accessible over the web, so I want to know how to secure Webmin so that it can't be used to compromise my server. So far these are my ideas, but I still don't feel comfortable: make Webmin accessible only from localhost (how?), and use SSH tunneling to reach it whenever I need to administer the server. Can anyone help me with instructions for making Webmin accessible only from localhost? What other ways can I make it secure?
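
    A hedged sketch of the localhost-only approach (the miniserv.conf key names are from memory and can differ between Webmin versions, so verify them against your install):

        # on the server: bind Webmin to the loopback interface only
        #   /etc/webmin/miniserv.conf
        #     bind=127.0.0.1
        #     allow=127.0.0.1
        sudo /etc/init.d/webmin restart

        # on your workstation: tunnel Webmin's port over SSH, then browse https://localhost:10000
        ssh -L 10000:localhost:10000 user@your-server

    With the bind in place, nothing on the network can reach the Webmin port directly, and the SSH tunnel becomes the only way in.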

    Read the article

  • How to get the list of price offers on an item from Amazon with python-amazon-product-api item_lookup

    - by miernik
    I am trying to write a function to get a list of offers (their prices) for an item based on the ASIN:

        def price_offers(asin):
            from amazonproduct import API, ResultPaginator, AWSError
            from config import AWS_KEY, SECRET_KEY
            api = API(AWS_KEY, SECRET_KEY, 'de')
            str_asin = str(asin)
            node = api.item_lookup(id=str_asin, ResponseGroup='Offers', Condition='All', MerchantId='All')
            for a in node:
                print a.Offer.OfferListing.Price.FormattedPrice

    I am reading http://docs.amazonwebservices.com/AWSECommerceService/latest/DG/index.html?ItemLookup.html and trying to make this work, but all the time it just says:

        Failure instance: Traceback: <type 'exceptions.AttributeError'>: no such child: {http://webservices.amazon.com/AWSECommerceService/2009-10-01}Offer

    Read the article

  • Hadoop in a RESTful Java Web Application - Conflicting URI templates

    - by user1231583
    I have a small Java Web Application in which I am using Jersey 1.12 and the Hadoop 1.0.0 JAR file (hadoop-core-1.0.0.jar). When I deploy my application to my JBoss 5.0 server, the log file records the following error:

        SEVERE: Conflicting URI templates. The URI template / for root resource class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods and the URI template / transform to the same regular expression (/.*)?

    To make sure my code is not the problem, I have created a fresh web application that contains nothing but the Jersey and Hadoop JAR files along with a small stub. My web.xml is as follows:

        <?xml version="1.0" encoding="UTF-8"?>
        <web-app version="2.5" xmlns="http://java.sun.com/xml/ns/javaee"
                 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd">
            <servlet>
                <servlet-name>ServletAdaptor</servlet-name>
                <servlet-class>com.sun.jersey.spi.container.servlet.ServletContainer</servlet-class>
                <load-on-startup>1</load-on-startup>
            </servlet>
            <servlet-mapping>
                <servlet-name>ServletAdaptor</servlet-name>
                <url-pattern>/mytest/*</url-pattern>
            </servlet-mapping>
            <session-config>
                <session-timeout>30</session-timeout>
            </session-config>
            <welcome-file-list>
                <welcome-file>index.jsp</welcome-file>
            </welcome-file-list>
        </web-app>

    My simple RESTful stub is as follows:

        import javax.ws.rs.core.Context;
        import javax.ws.rs.core.UriInfo;
        import javax.ws.rs.Path;

        @Path("/mytest")
        public class MyRest {

            @Context
            private UriInfo context;

            public MyRest() {
            }
        }

    In my regular application, when I remove the Hadoop JAR files (and the code that is using Hadoop), everything works as I would expect: the deployment is successful and the remaining RESTful services work. I have also tried the Hadoop 1.0.1 JAR files and have had the same problems with the conflicting URI template in the NamenodeWebHdfsMethods class. Any suggestions or tips in solving this problem would be greatly appreciated.

    Read the article

  • Perform action based on load avg

    - by sfx
    I'm running some web applications on a Debian server and sometimes have to struggle with DDoS attacks. They eat up all my resources and I can no longer SSH into the server. One idea was to drop all new connections while the load average is too high, so that some resources are left for me, and to accept connections again once the load average is low enough. Since this has to work under heavy load, I'm afraid a cron job either wouldn't react fast enough or would itself take too many resources. tl;dr: Is there a way to configure this behaviour based on whether the load average is above a specific threshold?
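
    A hedged sketch of the threshold idea, assuming iptables is in use and a one-minute cron job (or a small loop) is acceptable; the port and threshold are placeholders:

        #!/bin/sh
        # drop new HTTP connections while the 1-minute load average is above THRESHOLD
        THRESHOLD=20
        LOAD=$(awk '{print int($1)}' /proc/loadavg)

        # remove the rule if present (ignore errors), then re-insert it only under load
        iptables -D INPUT -p tcp --dport 80 --syn -j DROP 2>/dev/null
        if [ "$LOAD" -ge "$THRESHOLD" ]; then
            iptables -I INPUT -p tcp --dport 80 --syn -j DROP
        fi

    Keeping SSH on a port that is never subject to this rule, or rate-limiting HTTP with an iptables limit/connlimit match instead, are gentler variations on the same idea.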

    Read the article

  • Open-source generic web service to database interface?

    - by Joe Strout
    I'm looking for a thin, generic layer (probably written in PHP) that sits between a database (probably MySQL) and web service clients (which could be anything). I imagine there would be config files of some sort that tell it how to map web service requests to database queries (or other DB commands). It would also need to handle authentication and authorization, of course. I've done some googling but failed to find anything that fits the bill. Can anyone suggest something like this?

    Read the article

  • Network access to VM only from host ...

    - by Jamie
    I'm trying to do some testing of Ubuntu 10.04 Beta 2 Server in a VMware environment. The host is XP, and the VM software is VMware Player. The problem is that I want to be able to see the VM from the network, not just from the host. I can SSH into the VM from the host, but from any other machine on the network I can't even get the VM to respond to a ping. Going the other way (from the VM out) isn't a problem at all. The VM software used its 'easy installer', so I'm not really sure how the networking was set up. Suggestions?
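
    A hedged guess at the usual cause: VMware Player defaults to NAT networking, which hides the guest behind the host's address; switching the virtual NIC to bridged mode puts the guest directly on the LAN. With the VM powered off this can be changed in Player's network settings, or in the VM's .vmx file (key name as commonly documented; back the file up first):

        ethernet0.connectionType = "bridged"

    After booting, the guest should pick up an address from the same DHCP server as the other machines and become reachable from them.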

    Read the article

  • How to safely remove global.asax from web service

    - by Niklas
    I have an ASP.NET web service project that has a global.asax with empty Application_Start and Application_End implementations. As far as I can tell, in this case it serves no purpose and could be removed (correct me if I'm wrong). Do I need to do anything other than delete global.asax and global.asax.cs (such as change something in web.config or in the project settings)? Just asking so that I don't break some dependency I'm not aware of...

    Read the article

  • Smart subdomain routing via reverse proxy

    - by Trevor Hartman
    I have two servers on my home network: an OS X Server box and an Ubuntu server. I'd love to have the external subdomain osx.mydomain.com point to the OS X box and ubuntu.mydomain.com point to the Ubuntu box. I know the normal way to do this is to have a static external IP address for each, but that's not an option since this is just my home setup. My question is: is there a way to do this with some reverse proxy trickery? OS X is currently the default entry point for all traffic. I was able to set up a reverse proxy on OS X for ubuntu.mydomain.com on port 80, so web traffic is correctly proxied to the Ubuntu machine. I'd like to SSH in and do a bunch of other things too, though!
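
    SSH can't be routed by hostname the way HTTP can, but the OS X box can act as a jump host. A hedged client-side sketch using OpenSSH's -W forwarding (the host alias and internal address are placeholders):

        # ~/.ssh/config on the machine you connect from
        Host ubuntu-home
            HostName 192.168.1.50                      # internal address of the Ubuntu box
            ProxyCommand ssh user@osx.mydomain.com -W %h:%p

    With that in place, ssh ubuntu-home tunnels through the OS X server transparently, and other TCP services can be reached the same way with -L port forwards.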

    Read the article

  • Trying to script rsync using pam_exec

    - by Ricky-Rose
    I'm trying to write a shell script that will run rsync when called by pam_exec. I've tried a couple of different ways, and I'm not sure what I'm doing wrong. When I try to run the script at login by adding

        session optional pam_exec.so /usr/bin/local/sync.sh

    to my sshd PAM file, it gives me an exit code of 12. If I log in and then run the script manually, it connects to the remote server and lists my files, but it doesn't actually sync anything. I have tried the code below using both $USER and $PAM_USER; $PAM_USER doesn't work at all.

        #!/bin/sh
        rsync -azv -e ssh $USER@remote_server:/home/html/$USER/ /home/html/$USER
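
    A hedged sketch of how the script might look when driven by pam_exec, which runs it as root and exports PAM_USER and PAM_TYPE instead of setting $USER (the key path and log file are placeholders, and a passwordless key is assumed so ssh can run non-interactively):

        #!/bin/sh
        # only act when the session is being opened, not torn down
        [ "$PAM_TYPE" = "open_session" ] || exit 0

        user="$PAM_USER"
        rsync -az -e "ssh -i /root/.ssh/sync_key -o BatchMode=yes" \
            "$user@remote_server:/home/html/$user/" "/home/html/$user" \
            >> /var/log/pam-sync.log 2>&1

        exit 0

    Logging to a file matters here because pam_exec has no terminal to print to, and the explicit exit 0 keeps an rsync hiccup from being reported back to PAM.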

    Read the article
