Search Results

Search found 2838 results on 114 pages for 'considered harmful'.

  • What constitutes out-of-band access to a server?

    - by broiyan
    The first time I access my server with a new installation of FileZilla or PuTTY, I am prompted to continue only if the RSA key shown to me is correct. The cloud provider's website advises using their AJAX console to obtain the key out-of-band and compare it against the one shown by FileZilla. The AJAX console is launched from a link on the cloud provider's website, which requires a login. How exactly is this AJAX console considered out-of-band, when it is obviously not a form of physical access to the server?
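
    Incidentally, "out-of-band" here just means a channel independent of the connection being verified, not physical access; the console qualifies because its trust rests on the provider's website login and TLS rather than on the as-yet-unverified SSH connection. If you take that view, the check itself is a one-liner on the server - a minimal sketch, assuming a stock OpenSSH layout:

        # run inside the provider's console session, i.e. not over the
        # SSH/SFTP connection whose key you are trying to verify
        ssh-keygen -lf /etc/ssh/ssh_host_rsa_key.pub
        # compare the printed fingerprint with what PuTTY/FileZilla shows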

  • SQL Server environment

    - by Olegas D
    Hello, I'm considering some changes to our current sales environment and am trying to weigh all the pros and cons. Current situation: a SQL Server box (a fairly decent HP server, server1) plus a backup server (a smaller Dell machine, server2). All SQL files and SQL Server itself are on server1; if something goes wrong with server1, I will have to move to server2 manually. Connecting to the SQL Server: one HQ (where the server is located) plus four sites over VPN. Now I'm considering two scenarios: (1) buy a storage system, upgrade the existing servers (add RAM, upgrade processors) and move to VMware ESXi; (2) rent a server at a datacenter plus a virtual server in case the real server goes down, and also rent space at a storage provider to keep the SQL files there. Has anyone weighed up these options and perhaps drawn up a good pros/cons list? ;) Thanks

  • Cheapest Highly Available Web Server [closed]

    - by xyz
    I would like to create a highly available setup (e.g. a small cluster) for a web server running Apache, PHP and MySQL. There will be between 2 and 8 small websites with only very little traffic and workload; high availability, however, is very important. I don't want to be dependent on one datacenter, so there must be a minimum of two servers placed in different datacenters, and if one server goes down, users must experience no downtime, or only a minimum of it - and no data loss. I have considered Amazon AWS with Elastic Load Balancing, since it is possible to buy two EC2 instances in two availability zones and set up load balancing and RDS (Multi-AZ). However, this seems rather expensive: using the AWS price calculator http://calculator.s3.amazonaws.com/calc5.html it comes to $185/month in the first year (including the free tier). Are my calculations incorrect, or is there a cheaper way to build this HA setup? Best regards

  • How to configure HA iSCSI for Solaris 10

    - by Noah
    BACKGROUND: We have a StarWind NAS that we currently use for high-availability storage on our Windows network. StarWind has mirrored drives and multiple IP paths, which the Windows server combines into one HA disk store. QUESTION: How do I accomplish the same thing under Solaris 10? I've looked at ZFS, but the documentation seems to indicate that ZFS wants to do its own RAID/mirroring. I can also attach via iSCSI from Solaris and am presented with both drives served by the StarWind NAS. So how do I configure Solaris so that disks M1 and M2 are treated as a single fault-tolerant drive?
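
    Letting ZFS do the mirroring is the straightforward route here: attach both iSCSI LUNs and mirror them inside one pool. A minimal sketch for Solaris 10 - the discovery address and device names below are placeholders for your own:

        # point the initiator at the StarWind box and enable SendTargets
        iscsiadm add discovery-address 192.168.0.10:3260
        iscsiadm modify discovery --sendtargets enable
        devfsadm -i iscsi      # create device nodes for the new LUNs
        format                 # note the two new cXtYdZ device names
        # mirror the two LUNs into one fault-tolerant pool
        zpool create hapool mirror c2t1d0 c3t1d0

    ZFS then keeps the two LUNs in sync, and the pool survives the loss of either one - effectively what the Windows/StarWind combination was doing.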

  • Equivalent of phpMyAdmin for MSSQL?

    - by Tedd Hansen
    Is there any web interface for administering MSSQL similar to phpMyAdmin (for MySQL)? I want a self-service setup where developers can create a database through a web interface and upload/download backups of the database without local access. I've considered phpMSAdmin, but it hasn't had a release since 2006, so I'm not sure it's worth the effort of setting up. If there is something else (free or not-so-free), that would be great. My question is similar to one posted two years ago, but no good web interface was found back then. SQL Web Data Administrator seems interesting, but it lacks a few features - most notably creating new databases (it also hasn't been updated since 2007).

  • Why not install Msvcr71.dll into system32?

    - by hillu
    While looking for an authoritative source for the missing Msvcr71.dll needed by a few old applications, I stumbled across the MSDN article Redistribution of the shared C runtime component in Visual C++. The advice given to developers is to drop the DLL into the application's directory instead of system32, since DLLs in the application's directory are considered before the system paths. What can or will go wrong if I (as an administrator, not a developer) take the lazy path and install Msvcr71.dll (and Msvcp71.dll while I'm at it) into the system32 directory (of 32-bit Windows XP or Windows 7 systems) instead of putting a copy in each application's directory? Is there another good solution for providing applications with the needed DLLs that doesn't involve copying files to the application directories? Added after the first answers: I understand that incompatible API changes may have been made to these DLLs, but pretty much every mention of incompatibilities I have found using Google had to do with games or video codecs. Right now I expect the risk of breakage to be pretty small. Am I missing something?

  • Cloud service to receive up to 30000 emails a minute

    - by David
    I am building a business where I want the infrastructure to handle up to 30000 emails per minute during peak periods. What kinds of services offer this? I expect to download the emails using SMTP or similar. I expect each email to have a total attachment size of 2 MB, possibly spread over several attachments. I have considered SendGrid's Parse API, but I am worried because they offer the service for free; I have contacted them and am waiting for an answer. Are there better, more suitable alternatives?
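
    For scale: with a Parse-style service the provider takes delivery over SMTP and POSTs each parsed message to an HTTP endpoint you run, so "downloading" becomes handling webhooks, which is much easier to load-balance than polling mailboxes. A minimal receiver sketch (Flask assumed; the field names follow SendGrid's Inbound Parse conventions but should be verified against the current docs, and the save path is hypothetical):

        from flask import Flask, request

        app = Flask(__name__)

        @app.route("/inbound", methods=["POST"])
        def inbound():
            # message fields arrive as multipart/form-data
            sender = request.form.get("from")
            subject = request.form.get("subject")
            # each attachment is a separate file field: attachment1..N
            count = int(request.form.get("attachments", 0))
            for i in range(1, count + 1):
                f = request.files.get("attachment%d" % i)
                if f is not None:
                    f.save("/srv/mail-attachments/" + f.filename)
            return "", 200

        if __name__ == "__main__":
            app.run(port=8000)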

  • Is there a way to force Windows to recognize a network folder as a local drive, for the purposes of indexing?

    - by NoCatharsis
    I just started using the file search program Everything at work to search through documentation on our shared drives, after disappointments with Google Desktop and Windows Search. I love the speed of Everything, but I wish it were able to index other shared folders. My makeshift solution was to somehow force Windows to recognize the necessary shared folders as local drives and then add them to the index list. I have also considered using SyncToy, but that requires downloading all the data to my drive, which could be terabytes of information - obviously not a good idea on a small company network. What would be the best solution here?
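
    For the drive-letter half, the usual tools are net use (an ordinary mapped drive) and subst (a path alias); a sketch, with placeholder server and share names:

        :: ordinary persistent mapped drive
        net use X: \\fileserver\docs /persistent:yes
        :: subst can then alias any path to a further drive letter
        subst Y: X:\projects

    The caveat: neither makes the volume genuinely local, and Everything in particular builds its index from the NTFS master file table of local volumes, so a mapped or substituted drive may still be invisible to it.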

  • Can I create a DC without a DNS Server?

    - by onik
    So as the title says, I need to promote a standalone Win2008R2 server to a domain controller, and I don't have a DNS server (I think I don't need one), as there will be no clients connected to the domain; it will only be used for Remote Desktop Services. Yes, I know it's considered bad practice to install other roles on a DC, but in this case it's necessary. Do I need to install the DNS Server role, and if I do, how do I make it as transparent as possible? EDIT: It seems that I do need to install the DNS Server role, so how can I configure it not to mess up my entire domain? For example: the server I need to promote is rdc.mydomain.com, and it has an A record pointing to its IP in the current DNS, while the other servers under mydomain.com run Linux and don't need to know anything about this Windows box. The domain uses a third-party DNS, and all edits and updates have to be done via a separate web page; our servers don't have write/update access.

  • Nagios remote monitoring: NRPE vs. SSH

    - by sam
    We use Nagios to monitor quite a few (~130) servers. We monitor CPU, disk, RAM and a few other things on each server. I've always used SSH to run the remote commands, purely because it requires little to no additional config on the remote server: just install nagios-plugins, create the nagios user and add the SSH key, all of which I've automated into a shell script. I've never actually considered the performance implications of using SSH over NRPE. I'm not too bothered about the load hit on the Nagios server (it's probably over-specced for what it does and has never been over 10% CPU), but we run each remote check every 30 seconds and each server has 5 different checks performed. I assume SSH requires more resources for each check, but is there a huge difference (i.e., enough of a difference to warrant the switch to NRPE)? If it's any help, we monitor a mix of physical servers (normally with 8, 12 or 16 physical cores) and Amazon EC2 medium/large instances.
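
    The two styles look like this on the Nagios side; a sketch of equivalent command definitions (plugin paths and thresholds are placeholders):

        # check_by_ssh: full ssh session per check - key exchange plus a
        # forked ssh process on the Nagios server, and a login on the target
        define command {
            command_name  check_load_by_ssh
            command_line  $USER1$/check_by_ssh -H $HOSTADDRESS$ -C "/usr/lib/nagios/plugins/check_load -w 5,4,3 -c 10,8,6"
        }

        # check_nrpe: lightweight query to a resident agent, no key exchange
        define command {
            command_name  check_load_by_nrpe
            command_line  $USER1$/check_nrpe -H $HOSTADDRESS$ -c check_load
        }

    The per-check cost difference is mostly the SSH handshake and process fork: at 5 checks per host every 30 seconds across ~130 hosts, the Nagios server is performing roughly 22 handshakes per second.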

  • Our company claims that the DLP system can even monitor the contents of HTTPS traffic; how is this possible?

    - by Ryan
    There is software installed on all client machines for DLP (Data Loss Prevention) and HIPAA compliance, and supposedly it can read HTTPS data in the clear. I always thought that traffic between the browser and the server was encrypted the entire way. How can software sneak in and grab this data from the browser before it is encrypted, or after it is decrypted? I am just curious how this could be possible; I would think a browser wouldn't be considered very secure if it were.
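
    A common mechanism is a local TLS-intercepting proxy: the agent installs a corporate root CA into the machine's trust store, terminates the browser's TLS itself, reads the plaintext, and re-encrypts the traffic toward the real server (endpoint agents can also hook the browser's calls before encryption ever happens). One way to spot the proxy variant is to look at who issued the certificate your machine is actually shown; a quick sketch:

        import socket, ssl

        # fetch the certificate presented for a site; behind a DLP/TLS
        # interception proxy the issuer is typically the corporate CA
        # rather than a public one
        ctx = ssl.create_default_context()
        with socket.create_connection(("example.com", 443)) as sock:
            with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
                print(tls.getpeercert()["issuer"])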

  • Rate of UDP packet loss over WLAN

    - by Martin
    While testing something with TFTP I noticed lots of timeouts (and slow speeds as a result) when I used my WLAN - and no problems when using a network cable. A quick test program sending and receiving UDP revealed about 3-5% packet loss. While it's obvious that WLAN has to be less reliable than wired LAN, I have no idea what loss rates are considered 'normal' - or when there is a need to investigate the network infrastructure further. Are there 'typical' packet loss rates for WLAN (and for other network technologies, e.g. PowerLAN, WAN, ...)? Thanks
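
    For anyone who wants to reproduce the measurement, a minimal sketch of such a probe - sequence-numbered datagrams on one side, a count of unique sequence numbers on the other (port, sample size and pacing are arbitrary choices):

        # usage: python udploss.py recv           (on one host)
        #        python udploss.py send <host>    (on the other)
        import socket, sys, time

        PORT, COUNT = 9999, 1000

        def send(host):
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            for seq in range(COUNT):
                s.sendto(seq.to_bytes(4, "big"), (host, PORT))
                time.sleep(0.005)   # pace packets so we measure link loss,
                                    # not our own socket-buffer overruns

        def receive():
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            s.bind(("", PORT))
            s.settimeout(5)         # give up 5 s after the last packet
            seen = set()
            try:
                while True:
                    data, _ = s.recvfrom(64)
                    seen.add(int.from_bytes(data, "big"))
            except socket.timeout:
                pass
            lost = COUNT - len(seen)
            print("lost %d of %d (%.1f%%)" % (lost, COUNT, 100.0 * lost / COUNT))

        if __name__ == "__main__":
            send(sys.argv[2]) if sys.argv[1] == "send" else receive()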

  • DirectAccess on Server 2012

    - by francisswest
    Scenario: Windows Server 2012 with the Remote Access role installed; IP set to static, registered in DNS; three domain controllers, all running Server 2003 (I suspect this may be the issue). Done so far: DNS registered, firewall turned off after IPsec was applied; able to ping all three DCs with no issues. Problem: going through the DA wizard generates an error. I am logged in as a domain admin, have verified that I can ping the DC, and have verified that IPsec allows me to contact it. Since this version hasn't been widely deployed yet, there isn't much help available online from what I can find. Any assistance anyone could provide would be greatly appreciated. I am still new to the server world; user-wise I would fit between superuser.com and serverfault.com (junior admin). Thanks to anyone who may be able to assist!

  • Does using VLANs in your network infrastructure cause an appreciable decrease in performance?

    - by Peter Grace
    This is something I've never considered before and wanted the opinions of the experts. We use VLANs day in and day out for various network tasks. My modus operandi is that in general, if something supports VLANs, that port is getting trunked, because it just makes a ton of sense if there's even the slightest chance you need to do more than one thing on that single link. As I ponder this, though, I'm wondering whether there's a performance penalty involved with this line of thinking. Is the impact negligible?

  • Browser considers the www and non-www domains to be different

    - by user1444680
    I've bought a domain name and hosted it. My browser stores separate passwords for mydomain.com and www.mydomain.com, and also caches the two separately. I want them to be considered the same website. The zone records of mydomain.com are: an "A" record, "@", pointing to the IP address of my host, and a CNAME, www, pointing to "@". Since a CNAME signifies an alias, shouldn't the browser understand (like search engines do) that the two URLs refer to the same website? Is it the browser's fault? How do I correct the problem? Do I need some other record for www instead of the CNAME?
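
    The browser is actually behaving correctly: a CNAME only tells resolvers where to find the records, but mydomain.com and www.mydomain.com remain distinct host names and therefore distinct origins, so passwords and caches are kept apart. The usual fix is to pick one canonical name and redirect the other to it at the HTTP level; a sketch for Apache with mod_rewrite (swap the names if you prefer the bare domain):

        # .htaccess: permanently redirect the bare domain to the www name
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]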

  • Running OpenVZ virtual servers within a Xen XCP virtual server? Bad practice?

    - by Damainman
    I have one server with 8 GB RAM and two quad-core processors. It currently has Xen XCP installed, with CentOS 6.2 x64 running in a virtual machine. I have server control panel software that I want to use, and it administers OpenVZ machines via a web interface. My questions are: Would this be considered bad practice? Would there be a big performance hit? Should I avoid this altogether, or am I going about it all wrong? Thank you in advance.

  • MS Access ADP front end and SQL Server back end for field data collection?

    - by Brash Equilibrium
    I am an anthropologist. I am going to the field and will use a netbook to collect survey data. The survey forms will need to allow me to enter data into multiple tables, search tables, allow subforms, and be fast enough to not slow down my interview. I have considered storing the data in a SQL Server Express 2008 R2 server (there will be a lot of data) while using a Microsoft Access data project as a front end. To cut down the number of steps required to collect and store data, I'm considering using the netbook for both data storage and collection (after reading this article about SQL Server on a netbook). My questions are: (1) Is there a simpler solution that is also gratis (gratis because I already have a MS Access license from my workplace, and SQL Server Express is, obviously, free)? (2) Does my idea to store and collect data using the netbook make sense? Thank you.

  • What is the best way to duplicate DVDs in bulk?

    - by Axxmasterr
    I have some instructional videos I am getting ready to release on DVD, and I want to know the quickest and most cost-effective way to produce these in bulk. I am open to both customized PC-based software/hardware solutions and dedicated hardware appliances that perform the same function; all options will be considered seriously. I don't have a problem building a system for this purpose, but if I build something I would prefer it be able to make multiple copies at once. I figure I will need to make about 300 copies initially.

  • Time out SSH sessions after inactivity?

    - by Insyte
    PCI requirement 8.5.15 states: "If a session has been idle for more than 15 minutes, require the user to re-enter the password to re-activate the terminal." The first, and most obvious, way to deal with ssh sessions that are idling at the bash prompt is by enforcing a read-only, global $TMOUT of 900. Unfortunately, that only covers sessions sitting at the bash prompt. The spirit of the PCI spec would also require killing sessions running top/vim/etc. I've considered writing a */1 cron job that parses the output of "/usr/bin/w" and kills the associated shell, but that seems like a blunt instrument. Any ideas for something that would actually do what the spec requires and just lock the terminal? I've looked at away and vlock; they both seem great for voluntarily locking your terminal, but I need a cron/daemon task that will enforce locking.
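
    For reference, the $TMOUT half described above is only a few lines; a minimal sketch, assuming a distribution that sources /etc/profile.d at login:

        # /etc/profile.d/tmout.sh - log idle shells out after 15 minutes
        TMOUT=900
        readonly TMOUT
        export TMOUT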

  • How to whitelist a domain while blocking forgeries using that domain?

    - by QuantumMechanic
    How do you deal with wanting to whitelist a domain so that emails from it won't get eaten, without bogusly whitelisting emails forged to appear to be from that domain? whitelist_from_rcvd looks promising, but then you have to know at least the TLD of every host that could send you mail from that domain. Often RandomBigCompany.com will outsource email to one or more sending companies (like Constant Contact and the like) in addition to using servers that reverse-resolve to something in its own domain, and it looks like whitelist_from_rcvd can only map to one sending-server pattern, so that would be problematic. Is there a way to say something like "if email is from domain X, subtract N points from the spam score"? The idea is that if the mail is legit, the -N will all but guarantee it isn't considered spam; but if it is spam, hopefully all the other failed tests will render it spam even with the -N included.
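
    Both halves can be expressed in SpamAssassin's local.cf; a sketch with placeholder domain and score (note the second form keys on a forgeable header, which is exactly the trade-off being asked about):

        # verified path: hits only when the mail really came through a host
        # whose reverse DNS is in example.com, so From: forgeries don't match
        whitelist_from_rcvd  *@example.com  example.com

        # blunt "subtract N points if From: is domain X" form
        header   FROM_EXAMPLE_COM  From:addr =~ /\@example\.com$/i
        score    FROM_EXAMPLE_COM  -5.0
        describe FROM_EXAMPLE_COM  From address is at example.com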

  • Capture the build number for a remote-triggered Hudson job?

    - by EMiller
    I have a very simple in-house web app from which certain Hudson builds (on another server) can be triggered remotely. I have no problem triggering the builds, but I don't know how to capture the associated build number for later reference. I'm using the buildWithParameters trigger, and the actual result of that call is just a mess of HTML; I don't believe it gives me back the build number. I started down the path of pulling the whole build list for the job (via the API) and then reconciling that list against my records, but that's much more complicated than I'd like. I also considered sleeping for a few seconds after launching the job and then grabbing the latestBuild from the Hudson API, but I'm sure that will go wrong at some point (someone will fire off two jobs quickly, and I'll get the association wrong).
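
    One way to make the association robust is to tag each triggered build with a unique token parameter and then look that token up in the job's build list; a sketch (server, job and parameter names are hypothetical, and the job must define the extra parameter):

        import json, time, urllib.request, uuid

        HUDSON, JOB = "http://hudson.example.com", "myjob"
        token = uuid.uuid4().hex

        # trigger the build, smuggling the token in as a parameter
        urllib.request.urlopen("%s/job/%s/buildWithParameters?TRACKING_ID=%s"
                               % (HUDSON, JOB, token))

        # poll the build list until a build carrying our token appears
        build_number = None
        while build_number is None:
            time.sleep(5)
            api = json.load(urllib.request.urlopen(
                "%s/job/%s/api/json?depth=1" % (HUDSON, JOB)))
            for build in api.get("builds", []):
                for action in build.get("actions", []):
                    for p in action.get("parameters", []):
                        if p.get("value") == token:
                            build_number = build["number"]
        print(build_number)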

  • Is it okay to use random URLs instead of passwords?

    - by stew
    Is it considered "safe" to use a URL constructed from random characters like this? http://example.com/EU3uc654/Photos I'd like to put some files and picture galleries on a web server that should only be accessed by a small group of users. My main concern is that the files should not get picked up by search engines or curious power users poking around my site. I've set up an .htaccess file, only to notice that clicking on http://user:pass@url/ links doesn't work well with some browsers and email clients, prompting dialogs and warning messages that confuse my not-too-computer-savvy users.
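
    Whether this is "safe" mostly comes down to entropy and to keeping the URL from leaking (Referer headers, server logs, shared links). The 8-character segment above, if drawn from letters and digits, carries about 8 × log2(62) ≈ 47 bits - more than a typical password; a sketch of generating longer tokens:

        import secrets

        # ~128 bits of URL-safe randomness per token
        print(secrets.token_urlsafe(16))   # e.g. 'mJ2Xb4VY0cPq3Zr8kTA9_g'

    The usual companion measures: disable directory indexes, serve the galleries over HTTPS, and send an X-Robots-Tag noindex header so a leaked link doesn't end up in a search index anyway.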

  • How to migrate a Lotus Notes mail database to an Exchange public folder?

    - by elsni
    I need to migrate a Lotus Notes mail database to Exchange. In the past, users got mail in their Notes mail accounts and sorted it manually, by drag and drop, into a specific folder structure in a separate Notes mail database. This should be mapped to Exchange. I considered using Outlook and a SharePoint library mapped into Outlook, but Outlook does not support drag and drop from the mail account to a standard document library, and the discussion libraries do not support folders. So I think the easiest way is to use an Exchange public folder instead of a SharePoint library, which should work as it did in Notes (correct me if I'm wrong). But how do I migrate the old Notes database to the public folder, including all subfolders? Thank you!

  • Dangers of the Python eval() function

    - by LukeP
    I am creating a game - specifically, a Pokémon battle simulator. I have an SQLite database of moves in which a row looks something like: name | type | Power | Accuracy | PP | Description However, there are some special moves. For these special moves, the damage (and other attributes not shown above, like status effects) may depend on certain factors. Rather than create a huge if/else block in one of my classes covering the formulas for every one of these moves, I'd rather include another column in the DB that contains each formula in string form, like 'self.health/2' (simplified example), and just plug that into eval(). I always see people saying to stay away from eval, but from what I can tell the dangers of eval only come into play when accepting user input. Am I correct in this assumption, or is there something I'm not seeing?
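
    Even with no user input, a stray quote in a formula or a tampered-with database row turns eval() into arbitrary code execution, so a small expression evaluator is a safer way to keep the formula-in-a-column design (store 'health / 2' rather than 'self.health/2' and pass the stats in). A minimal sketch using the ast module, restricted to arithmetic over whitelisted names:

        import ast, operator

        OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
               ast.Mult: operator.mul, ast.Div: operator.truediv}

        def eval_formula(expr, names):
            """Evaluate arithmetic over whitelisted variables; nothing else."""
            def walk(node):
                if isinstance(node, ast.Expression):
                    return walk(node.body)
                if isinstance(node, ast.BinOp) and type(node.op) in OPS:
                    return OPS[type(node.op)](walk(node.left), walk(node.right))
                if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                    return node.value
                if isinstance(node, ast.Name) and node.id in names:
                    return names[node.id]
                raise ValueError("disallowed construct in formula")
            return walk(ast.parse(expr, mode="eval"))

        # formula as it might be stored in the moves table
        print(eval_formula("health / 2", {"health": 100}))   # -> 50.0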

  • Suggestions for hosted file sharing services

    - by Jon
    Before I pose my question, I will give some insight into my scenario: I work for a small business (cost is an important factor); our bandwidth is limited and would not support an in-house FTP server; and we need to share files (mostly PDF, InDesign and Illustrator documents) with our clients. As we expand, we are finding that our current locally hosted FTP solution is too slow and is becoming a detriment to our sales team. What we need is a remotely hosted solution for sharing files with our clients, specifically with the following features: more than 100 GB of secure storage; the ability to give clients unique login credentials, granting access to a personalized directory or folder while limiting access to other files on the server; and a relatively simple web-based UI for clients with limited computer knowledge. We have considered a dedicated remote server as well as web-based services (box.net, yousendit.com, onehub.com, filesanywhere.com), but I am unsure about the direction we should take - have I left out another solution? What would you suggest? Thanks in advance.
