Search Results

Search found 22866 results on 915 pages for 'ftp client'.

Page 357/915

  • Should I encrypt data in database?

    - by Tio
    I have a client for whom I'm going to build a web application for patient care: managing patients, consultations, history, calendars, basically everything in that area. The problem is that this is sensitive data (patient history and so on). The client insists on encrypting the data at the database level, but I think this will hurt the performance of the web app (though maybe I shouldn't be worried about that). I've read the data-protection laws on health records (Portugal), but they aren't very specific about this (I've just asked the authorities about it and am waiting for their response). I've also read a related question, but my question is different: should I encrypt the data in the database or not? One problem I foresee with encrypting the data is that I'm going to need a key. It could be the user's password, but we all know what user passwords tend to look like ("12345", etc.), and if I generate a key I have to store it somewhere, which means the programmer, the DBA or whoever could have access to it. Any thoughts on this? Even adding a random salt to the user's password doesn't solve the problem, since I can always access it and therefore decrypt the data.
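
    If application-level encryption is chosen over transparent database/disk encryption, one common pattern is to keep the key outside the database (environment variable, key-management service) rather than deriving it from the user's password. A minimal sketch using Python's third-party cryptography package; the field value, the environment-variable name and the whole key-handling scheme are illustrative assumptions, not something taken from the question:

        # Encrypt a sensitive field before it is written to the database, using a
        # key kept outside the database (here: an environment variable).
        import os
        from cryptography.fernet import Fernet

        # Generate once with Fernet.generate_key() and store it in a secrets store;
        # never hard-code it or derive it from the user's password alone.
        key = os.environ["PATIENT_DATA_KEY"]      # hypothetical variable name
        cipher = Fernet(key)

        def encrypt_field(plaintext: str) -> bytes:
            """Encrypt a single field before INSERT/UPDATE."""
            return cipher.encrypt(plaintext.encode("utf-8"))

        def decrypt_field(token: bytes) -> str:
            """Decrypt a field after SELECT."""
            return cipher.decrypt(token).decode("utf-8")

        if __name__ == "__main__":
            token = encrypt_field("asthma, penicillin allergy")
            print(token)                  # what a raw database dump would expose
            print(decrypt_field(token))   # what the application sees

    The concern raised in the question still applies: whoever can read that key (or the application code) can decrypt the data, so this protects against leaked database dumps and backups rather than against a malicious administrator.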

    Read the article

  • Can't use HTTPS with ServerXMLHTTP object

    - by Imraan
    I am supporting a Classic ASP application that connects to a payment gateway via HTTPS. Up until recently there were no issues. A few days ago this broke without the code, the IIS configuration or anything else local changing, and it is broken on at least three separate servers. The last run of Windows Updates was in late November, but bringing the servers' updates up to date has not resolved the problem. A code snippet is below.

        Dim oHttp
        Dim strResult
        Set oHttp = CreateObject("MSXML2.ServerXMLHTTP")
        oHttp.setOption 2, 13056
        oHttp.open "POST", SOAP_ENDPOINT, false
        oHttp.setRequestHeader "Content-Type", "application/soap+xml; charset=utf-8"
        oHttp.setRequestHeader "SOAPAction", SOAP_NS + "/" & SOAP_FUNCTION
        oHttp.send SOAP_REQUEST

    Below is a dump of the error object:

        Number: -2147012852
        Description: A certificate is required to complete client authentication
        Message: A certificate is required to complete client authentication

    I initially posted the question on Stack Overflow (http://stackoverflow.com/questions/9212985/cant-use-https-with-serverxmlhttp-object) thinking it was a code issue, but further investigation seems to point to a server issue.
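
    The error text suggests the gateway has started requesting (or requiring) a TLS client certificate, which would explain a breakage with no local changes. As a quick check from any machine, a small probe shows whether the basic TLS handshake still succeeds and what gets negotiated; this is a minimal Python sketch, with the gateway hostname as a placeholder rather than anything from the original question:

        import socket
        import ssl

        HOST = "gateway.example.com"   # placeholder for the payment gateway host
        PORT = 443

        ctx = ssl.create_default_context()
        # If the gateway now demands a client certificate, point at one here:
        # ctx.load_cert_chain(certfile="client.pem", keyfile="client.key")

        with socket.create_connection((HOST, PORT), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
                print("negotiated:", tls.version(), tls.cipher())
                print("server cert subject:", tls.getpeercert().get("subject"))

    If the connection only fails once a client certificate is requested, the Classic ASP side would also need to select a certificate via ServerXMLHTTP's setOption, but whether that is the right fix depends on what the gateway operator actually changed.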

    Read the article

  • How do I daemonify my daemon?

    - by jonobacon
    As part of the Ubuntu Accomplishments system I have a daemon that runs as well as a client that connects to it. The daemon is written in Python (using Twisted); it provides a D-Bus service and a means of processing requests from the clients. Right now the daemon is just a program I run before I run the client; it sets up the D-Bus service and provides an API that can be used by the clients. I want to transform this into something that can be installed and run as a service for the user's session (e.g. starting on boot), with a means to start and stop it, etc. The problem is that I am not sure what I need to do to properly daemonify it so it can run as this kind of service, so I wanted to ask if others can provide some guidance. Some things I need to ask:

    1. How can I run it as a service for the current user's session (not a system service right now)?
    2. How do I ensure I can start, stop, and restart this session service?
    3. When packaging this, how do I ensure that it is installed as a service for the user's session and is started on login, etc.?

    In responding, if you can point me to specific examples or solutions I need to implement, that would be helpful. :-) Thanks! Jono
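
    For the daemonisation part specifically, Twisted already ships the tooling: wrapping the daemon in a .tac application file lets twistd take care of forking, PID files and logging, and the per-session startup can then be a matter of launching "twistd -y" from an XDG autostart entry or an Upstart user-session job. A minimal sketch of the .tac side; the file name, service name and the commented-out import are assumptions, not the actual Accomplishments code:

        # accomplishments.tac -- hypothetical application file, run via:
        #   twistd -y accomplishments.tac        (add --nodaemon while debugging)
        from twisted.application import service

        # twistd looks for a top-level object called "application".
        application = service.Application("ubuntu-accomplishments-daemon")

        # Attach the existing daemon logic here as an IService, for example:
        #   from accomplishments.daemon import AccomplishmentsService   # hypothetical
        #   AccomplishmentsService().setServiceParent(application)

    Start, stop and restart then come down to invoking twistd and signalling the PID it records in its pidfile, and the packaging question becomes installing the autostart entry (or session job) that runs that command at login.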

    Read the article

  • How do I use a URL path instead of a file path in an Open File dialog in Mac OSX or ChromiumOS?

    - by Chris
    In Windows 7 (and perhaps earlier), the default "Open File" dialog box allows you to type a full URL into the "File name" field as if it were a file path, e.g. "http://www.example.com/pic.gif" instead of "C:/windows/pictures/pic.gif". When uploading a file to a website on the client side - say, an image - this allows the client to upload a picture located on a server accessible via that URL instead of downloading the image, saving it locally, and then referencing the local copy in the "Open File" dialog. It's a great option for Windows users. I have three separate questions:

    1. What is this procedure formally called?
    2. How do I describe it succinctly so that my searches for more information are fruitful?
    3. Can something similar be done in Mac OS X, Chromium OS, or a Linux environment? If so, how?

    Thanks!

    Read the article

  • Firestarter Blocking DHCP?

    - by Chiggins
    Alright, so on my Ubuntu laptop I get a wireless internet signal and then distribute it to a switch where my other computers are connected. I also have Firestarter installed on the laptop. The problem is that whenever one of my client computers tries to get an IP address and so on from my laptop, it can't, because Firestarter is blocking it somehow. I have to stop the firewall in Firestarter and connect from the client - it connects right away - and then turn Firestarter back on. In the preferences there is an option under "Firewall" called "Start/restart firewall on DHCP lease", but whether I have that checked or not doesn't make a difference. So, what can I do to fix this? It's kind of annoying to have to do this every time I want to connect to the internet. Thanks!
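
    Firestarter is essentially a front end for iptables, so one approach is to explicitly allow the DHCP server ports on the shared (wired) interface whenever the firewall starts; Firestarter reportedly runs a /etc/firestarter/user-pre hook script where such rules can live, though that path should be treated as an assumption. The sketch below just issues the equivalent iptables rules from Python to show what needs to be allowed; the interface name is also an assumption:

        # Allow incoming DHCP (and DNS, if the laptop also resolves names for the
        # LAN clients) on the interface shared with the switch. Run as root.
        import subprocess

        SHARED_IFACE = "eth0"   # assumption: the wired interface facing the switch

        rules = [
            # DHCP: clients broadcast from UDP port 68 to the server on UDP port 67
            ["iptables", "-I", "INPUT", "-i", SHARED_IFACE,
             "-p", "udp", "--sport", "68", "--dport", "67", "-j", "ACCEPT"],
            # DNS, in case the laptop is also the resolver for the LAN clients
            ["iptables", "-I", "INPUT", "-i", SHARED_IFACE,
             "-p", "udp", "--dport", "53", "-j", "ACCEPT"],
        ]

        for rule in rules:
            subprocess.run(rule, check=True)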

    Read the article

  • How to properly uninstall/reinstall Ubuntu One on Windows XP?

    - by user73303
    I had previously installed an Ubuntu One client as a test on a Windows XP machine. Now I wanted to change the account the client uses to a production one, but I had problems changing the email address, so I decided to do a reinstall. I ran the uninstaller, then downloaded ubuntuone-3.0.2-windows-installer.exe. It downloads and goes through the unpacking/install; strangely, some of the messages say "updating", as if it were replacing something that was already there. I do not get the setup/sign-in screen. There are no ubuntu% processes running. The Program files/ubuntuone directory exists, with data and dist folders. The U icon is on the desktop, pointing at ubuntuone/dist/ubuntuone-control-panel-qt.exe, but this does not run. I ran the uninstaller again, deleted the Program files/ubuntuone directory, removed any Ubuntu entries from the registry and rebooted. I downloaded and ran the installer again; exactly the same as above. How can I uninstall Ubuntu One to get a clean reinstall? Or force the install to continue after downloading/unpacking?
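
    When a Windows reinstall behaves like an upgrade, it is usually because the previous install left traces in the per-machine or per-user Uninstall registry keys. A small sketch for locating any leftover Ubuntu One entries under those standard keys so they can be removed before reinstalling; the search string is an assumption, and whether to delete what it finds is left to you:

        # List Add/Remove Programs registry entries whose display name mentions
        # "Ubuntu One". Uses Python 3's winreg (on Python 2 the module is _winreg).
        import winreg

        UNINSTALL_PATHS = [
            (winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"),
            (winreg.HKEY_CURRENT_USER, r"Software\Microsoft\Windows\CurrentVersion\Uninstall"),
        ]

        for root, path in UNINSTALL_PATHS:
            try:
                key = winreg.OpenKey(root, path)
            except OSError:
                continue
            with key:
                for i in range(winreg.QueryInfoKey(key)[0]):      # number of subkeys
                    name = winreg.EnumKey(key, i)
                    with winreg.OpenKey(key, name) as sub:
                        try:
                            display, _ = winreg.QueryValueEx(sub, "DisplayName")
                        except OSError:
                            continue
                        if "ubuntu one" in str(display).lower():
                            print(path + "\\" + name, "->", display)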

    Read the article

  • Remote access and local access same hostname

    - by cpf
    Hi serverfault, I have a server in a client's network, separated from theirs by a router/firewall. The intention is to have this server available through one hostname (example.com). My idea is to have (at least) a DNS server on the outside, so that machines outside the client's network can reach the internal server. The problem at that point would be the internal client (PC A). My question: what would I have to do to make something like this work? Is it even possible, or has it already been done? The goal is not to have to change anything on either PC A or PC B, while both should reach the same internal server when browsing to "example.com". Perhaps adding logic to the DNS server would work (detect that the external IP of the internal client [PC A] is the same as the IP for example.com, and reply with the local IP?). Anyhow: thanks for helping me think about this!

    Read the article

  • How can I unify my email, calendar and tasks (2 exchange accounts + 1 gmail)

    - by Assaf Stone
    This is my situation: I work as a consultant, and thus work out of multiple computers:

    - my work laptop
    - a desktop at my primary client
    - my desktop at home
    - an Android smartphone
    - an Android tablet

    Likewise, I have multiple accounts:

    - a Microsoft Exchange (2010, AFAIK) account
    - a Microsoft Exchange (2007, AFAIK) account
    - a Gmail account

    The most important thing I need is the ability to have events in one calendar affect the free/busy status of all the other accounts (so that if I am busy on Monday at 9am with an event from my employer's account, that time will show as busy in my client's account and in the Gmail account). The second thing I need is a unified view of all of my accounts' info: appointments, email, tasks and contacts (in that order of importance). I've already tried Outlook synchronization tools such as gSyncit to sync both Exchange accounts with Gmail, but this creates a mess when updating appointments (deleted appointments sometimes return, timestamps revert). Is there perhaps some way to at least synchronize the free/busy state so that all of my calendar apps/accounts will look there to see whether I can be invited? Just solving that would be well worth my while. Thanks, Assaf

    Read the article

  • Wakanda 3: a new UI editor for touch and mobile, the JavaScript development platform demoed at JS.everywhere()

    4D launches the preview of Wakanda, a platform for developing and deploying 100% JavaScript applications. The French development-tools vendor 4D has just released a developer preview of its Wakanda solution. Wakanda is the first "end-to-end" platform dedicated to building web applications entirely in JavaScript, on the client side as well as the server side. These applications are then accessible from any desktop or mobile browser. Wakanda is thus an integrated environment for developing and deploying line-of-business applications on the internet using 100% JavaScript. [IMG]http://ftp-developpez.com/gordo...

    Read the article

  • Is Samba "remote browse sync" possible across OpenVPN tunnel?

    - by John Reynolds
    I'm connecting two TomatoUSB (Shibby build on a WNR3500L v2) routers with a routed OpenVPN connection:

        -----------------------                -----------------------
        | Router 1, subnet 20 |  <--tunnel-->  | Router 2, subnet 21 |
        -----------------------                -----------------------

    Router 1 is the OpenVPN server and Router 2 is a client. Clients attached to the routers on both subnets can ping clients on the other subnet, so the tunnel and routing work. I've enabled file sharing on both in order to get their Samba WINS servers running. Is it possible to get name resolution across the tunnel? I've tried "remote browse sync = 192.168.21.1" in /etc/smb.conf on the server side, to no avail. I also tried using the IP address that the client gets from the OpenVPN address pool (usually 10.8.0.something), but still no joy.

    Read the article

  • Android 2.2 spotted on the Net; the next version of Google's mobile OS could officially arrive in May

    Android 2.2 spotted on the Net: the next version of Google's mobile OS could officially arrive in May. "Froyo", the code name of the upcoming Android version, has been spotted on the Net, so Google is apparently testing Android 2.2 internally. That, at least, is the belief of Android and Me, a site specialising (among other things) in measuring Android traffic. [IMG]http://ftp-developpez.com/gordon-fowler/android22.png[/IMG] This new version will offer a JIT (Just In Time) compiler, a technique that compiles code to the device's native code at run time and keeps the result in ...

    Read the article

  • Touch Mouse, Microsoft's multi-touch mouse born of a futuristic R&D programme, soon to go on sale for Windows 7

    Touch Mouse, Microsoft's multi-touch mouse, born of its forward-looking R&D programme and soon to go on sale. At the CES 2011 international show in Las Vegas, Microsoft has just presented a mouse called Touch Mouse. With a rather avant-garde design, this mouse is a multi-touch input device designed for Windows 7. It grew out of Microsoft Research's Mouse 2.0 project and the Applied Sciences Group. [IMG]http://ftp-developpez.com/gordon-fowler/Touch-Mouse.jpg[/IMG] "The new Touch Mouse is perfect for interacting more naturally with a Windows 7 PC," explains Ma...

    Read the article

  • Why does this service refuse to start on Windows Server 2003?

    - by PenguinCoder
    We have a Windows 2003 server with Cebos MQ1 (ver. 7 and ver. GRI) products installed that have been operational for years. After installing the Microsoft 2010 C++ Redistributable package, needed for other development, the MQ1 GRI service now fails to start. Event logs showed that two additional updates (.NET 4 and the 2010 C++ Redistributable SP2) were installed by the redistributable as well. As soon as we discovered the MQ1 service was not starting properly, we removed these three installed packages. However, the service still does not start; the dialog that pops up states 'The service started then stopped.'

    Event logs from our attempts to start the service show nothing: no errors, crashes, failures or other information related to the service. Executing MQ1Serv.exe directly reports 'Missing command line operation, must specify install, uninstall and company abbreviation.' Running sc query MQ1Service(GRI) shows a clean exit, with a Win32ExitCode of 0x0. Attempting to reinstall the client or server software gives the error 'The procedure entry point ReInitializeCriticalSection could not be located in the dynamic link library KERNEL32.dll.' at the 'Registering Libraries' stage. Further research suggested that the required function is in URL.dll and that we should verify the library is not corrupted.

    Running sfc /scannow on the server replaced a few DLLs, including URL.dll, with versions from 2005. This actually broke other applications, which then required reinstalls (one of them being IE 7). After the reinstalls and updates, url.dll is version 7.0.5730.13 (2009) and Kernel32.dll is version 5.2.3790.4480 (2009). The MQ1 GRI service still will not start, giving the same 'Service started then stopped' error as before. Running a disassembler on Kernel32.dll and Url.dll shows no function named ReInitializeCriticalSection. Reinstalling the MQ1 client and server and starting the service again fails once more. However, setting the compatibility mode on the MQ1 client install exe to 'Windows 95' does get the program to install; setting the compatibility mode on the MQ1 server service does not enable it to start.

    I have been researching this problem for nearly a week and, besides the advice to scan and replace url.dll, have come to no successful conclusions. The service was operational prior to the 2010 C++ install, without any additional parameters or settings, and removing the C++ install and all the service packs/updates it installed silently has not corrected the issue of the MQ1 GRI service not starting. Q: Has anyone else run into this or a similar issue while attempting to get a service initialized? What have I overlooked, or what else can I try in order to get this service started?
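
    One way to pin down the "procedure entry point ... could not be located" part is to check directly which system DLL, if any, actually exports the symbol the installer asks for, rather than inferring it from file versions. A small generic sketch using Python's ctypes; the symbol names are taken from the error message and the question, and the check itself implies nothing about where the export is supposed to live:

        # Check whether a DLL exports a given function by name.
        # ctypes raises AttributeError when GetProcAddress cannot find the symbol.
        import ctypes

        def has_export(dll_name: str, symbol: str) -> bool:
            try:
                dll = ctypes.WinDLL(dll_name)
                getattr(dll, symbol)
                return True
            except (OSError, AttributeError):
                return False

        for dll in ("kernel32", "url.dll"):
            for sym in ("InitializeCriticalSection", "ReInitializeCriticalSection"):
                print(dll, sym, "->", has_export(dll, sym))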

    Read the article

  • Are you satisfied with Free Mobile? Take part in our survey and help us better assess the quality of the new operator's service

    Are you entirely satisfied with Free Mobile? Take part in our survey to help better assess the quality of the new operator's service. Free Mobile works. That is a fact. But Free Mobile sometimes works badly. Some will say it must be given time; others that Xavier Niel made too many promises, too soon. [IMG]http://ftp-developpez.com/gordon-fowler/FreeMobile/Free%20Mobile%202.png[/IMG] Far be it from us to settle that question, but an anecdote from last night prompted us to ask for your opinion on the subject. A member of our editorial team, a Free Mobile subscriber, tried to place a call to a...

    Read the article

  • Apache certificates for some URLs not working

    - by Vegaasen
    We are having a rather strange problem with an Apache installation. Here is a short summary: I'm currently setting up Apache with HTTPS and server certificates. This is fairly easy and works straight out of the box, as expected. This is the configuration for that setup:

        Listen 443
        SSLEngine on
        SSLCertificateFile "/progs/apache/ssl/example-site.no.pem"
        SSLCertificateKeyFile "/progs/apache/ssl/example-site.no.key"
        SSLCACertificateFile "/progs/apache/ssl/ca/example_root.pem"
        SSLCADNRequestFile "/progs/apache/ssl/ca/example_intermediate.pem"
        SSLVerifyClient none
        SSLVerifyDepth 3
        SSLOptions +StdEnvVars +ExportCertData
        RequestHeader set ssl-ClientCert-Subject-CN "%{SSL_CLIENT_S_DN}s"
        RewriteEngine On
        ProxyPreserveHost On
        ProxyRequests On
        SSLProxyEngine On
        ...
        <LocationMatch /secureStuff/$>
          SSLVerifyClient require
          Order deny,allow
          Allow from All
        </LocationMatch>
        ...
        <Proxy balancer://exBalancer>
          Header add Set-Cookie "EX_ROUTE=EB.%{BALANCER_WORKER_ROUTE}e; path=/" env=BALANCER_ROUTE_CHANGED
          BalancerMember http://10.0.0.1:7200 route=ee1 retry=300 flushpackets=off keepalive=on
          BalancerMember http://10.0.0.2:7200 route=ee2 retry=300 flushpackets=off keepalive=on status=+H
          ProxySet stickysession=EX_ROUTE scolonpathdelim=Off timeout=10 nofailover=off failonstatus=505 maxattempts=1 lbmethod=bybusyness
          Order deny,allow
          Allow from all
        </Proxy>
        RewriteCond %{REQUEST_URI} !^/index.html [NC]
        RewriteRule ^/(.*)$ balancer://exBalancer/$1 [P,NC]
        ProxyPassReverse / balancer://exBalancer/
        Header edit Set-Cookie "(.*)" "$1;HttpsOnly"
        ...

    So, everything works fine and as expected for all pages that are not covered by the LocationMatch directive. When requesting something that matches the LocationMatch directive, I'm asked for a certificate (because of the "SSLVerifyClient require" directive), and my browser offers all the correct certificates based on the root/intermediate chain. After choosing a certificate and clicking "OK", this is what pops up in the Apache logs:

        [ssl:info] [pid 9530:tid 25] [client :43357] AH01998: Connection closed to child 86 with abortive shutdown (
        [Thu Oct 11 09:27:36.221876 2012] [ssl:debug] [pid 9530:tid 25] ssl_engine_io.c(1171): (70014)End of file found: [client 10.235.128.55:45846] AH02007: SSL handshake interrupted by system [Hint: Stop button pressed in browser?!]

    And this just spams the logs. What is happening here? I can see this configuration working on my local machine, but not on one of our servers. There are no configuration differences between the servers, only minor application-wise changes. I've tried the following:

        1) Removing the CA-certificate checking (works)
        2) Requiring a CA certificate for the whole site (works)
        3) Adding "SSLVerifyClient optional" (does not work)
        4) ++

    Server/application information:

        Local:
        - OpenSSL 1.0.1x
        - Apache 2.4.3
        - Ubuntu
        - mpm: event
        - every configuration should be turned on
        (Failing) server:
        - OpenSSL 0.9.8e
        - Apache 2.4.2
        - SunOS
        - mpm: worker
        - every configuration should be turned on

    Please let me know if more information is needed; I'll provide it instantly. Brief sum-up:

        - Running Apache 2.4
        - Server certificates work just fine
        - Client certificates for some Locations do not work; the requests fail with the errors above

    PS: Could it be related to the OpenSSL version and the renegotiation changes related to TLS/SSLv3?

    Read the article

  • Input prediction and server re-simulation

    - by Lope
    I have read plenty of articles about multiplayer principles and I have a basic client-server system set up. There is however one thing I am not clear on. When a player enters input, it is sent to the server, which steps back in time to work out what should have happened at the time of that input and re-simulates the world from there. So far everything's clear. All the articles took shooting as an example, because it is easy to explain and pretty straightforward, but I believe movement is more complicated. Imagine the following situation: two players move towards each other.

        A------<------B

    Player A stops halfway to the collision point, but there is a lag spike, so the command does not arrive at the server for a second or so. The state of the world on the server (and on the other clients as well) at the time the input arrives is this:

        [1]: -------AB-------

    The command arrives, we go back in time and re-simulate the world, and the result is this:

        [2]: ---AB-----------

    Player A sees situation [2], which is correct, but on the server and the other clients player A is suddenly teleported from the position in [1] (the centre) to the position in [2]. Is this how it is supposed to work? The point of client prediction is to give the lagged player the feeling that everything is smooth, not to ruin the experience for the other players. The alternative is to discard the timestamp on the player's input and handle it when it arrives at the server, without going back in time. This, however, creates even more severe problems for the lagged player (even if he is lagging just a bit).
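
    This is the classic rewind-and-replay (lag compensation) trade-off: the server correction is authoritative, and the usual answer is that the other clients do not snap to the corrected position but smooth towards it over a short window, so the "teleport" becomes a brief slide. A minimal 1-D Python sketch of the server-side rewind plus the client-side smoothing; all names and the fixed tick rate are assumptions for illustration, not a prescribed implementation:

        # 1-D sketch of server-side rewind/re-simulation and client-side smoothing.
        from dataclasses import dataclass

        TICK = 1.0 / 20.0          # assumed fixed simulation step (20 Hz)

        @dataclass
        class Snapshot:
            tick: int
            positions: dict         # player id -> x position
            velocities: dict        # player id -> x velocity

        class ServerWorld:
            def __init__(self, positions, velocities):
                self.tick = 0
                self.positions = dict(positions)
                self.velocities = dict(velocities)
                self.history = [Snapshot(0, dict(positions), dict(velocities))]

            def step(self):
                for pid in self.positions:
                    self.positions[pid] += self.velocities[pid] * TICK
                self.tick += 1
                self.history.append(Snapshot(self.tick, dict(self.positions), dict(self.velocities)))

            def apply_late_input(self, pid, input_tick, new_velocity):
                """Rewind to the tick the input was issued at, apply it, replay to now."""
                snap = next(s for s in self.history if s.tick == input_tick)
                self.positions = dict(snap.positions)
                self.velocities = dict(snap.velocities)
                self.velocities[pid] = new_velocity        # e.g. 0.0 for "stop"
                self.history = self.history[: input_tick + 1]
                current = self.tick
                self.tick = input_tick
                while self.tick < current:
                    self.step()

        # On the other clients, the corrected position is not applied instantly:
        def smooth(displayed_x, server_x, factor=0.2):
            """Move a fraction of the way towards the authoritative position each frame."""
            return displayed_x + (server_x - displayed_x) * factor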

    Read the article

  • Remote Desktop Encryption

    - by Kumar
    My client is RDP 6.1 (on Windows XP SP3) and the server is Windows Server 2003. I have installed an SSL certificate on the server for RDP. In the RDP settings (General tab), the encryption method is set to SSL/TLS 1.0 and the encryption level is set to "Client Compatible". I have the following questions:

    1. In this case, is it guaranteed that all communication is encrypted when I log in to the server remotely? I mean, is the password encrypted?
    2. Does RDP always use some kind of encryption even if there is no SSL certificate installed on the server? In that case I do not see the security lock in the connection bar, whereas when I set the encryption level to "High" I do see it. I believe the communication is encrypted in both cases. Is that true?

    Please reply to my questions. Thanks in advance, Kumar

    Read the article

  • A search bar bug in Chrome 20 makes users suspect malware; how to work around it

    A search bar bug in Chrome 20 is being mistaken for malware, and here is how to work around it. Since version 20, Chrome has contained a rather annoying bug: the search bar automatically leads to a blank page. The first reflex of many users was to think this behaviour was caused by new malware. In fact, that is not the case at all. If you are among those who were worried, rest assured: the temporary workaround is very simple, though tedious. You just have to remove "blank.html" from the generated URL. This solution was found after 116 messages in the Google Group dedicated to the problem. [IMG]http://ftp-developpez...

    Read the article

  • Jobs: APEC highlights Next Step 3 ans, its increasingly successful service for executives who feel the need for a change

    Jobs: APEC highlights Next Step 3 ans, its increasingly successful service for executives who feel the need for a change. "Need to take stock, or interested in a new direction?" If you answer this APEC question in the affirmative, the association offers to support you in your analysis and your decision. The Association Pour l'Emploi des Cadres has developed a service that lets you analyse your professional situation in depth and design a development plan with a consultant. [IMG]http://ftp-developpez.com/gordon-fowler/Next%20Step%203%20a...

    Read the article

  • Chrome 22 available: a release geared towards online 3D games, the Pointer Lock API and improvements for Windows 8 and Retina displays

    Chrome 22 available: a release aimed at FPS and online 3D games, with support for the "Pointer Lock" API and improvements for Windows 8 and Retina displays. Google keeps improving its browser, which is becoming a truly versatile platform. The firm is updating its Chrome web browser so that it is more usable by fans and developers of FPS (first-person shooters) and online 3D games. [IMG]http://ftp-developpez.com/gordon-fowler/Chrome%20Logo.png[/IMG] Google has just released a new "major" stable version of Chrome, number 22. It now includes support for the JavaScript "Pointer...

    Read the article

  • Is there an open source solution that I can host on a web server that will allow users to anonymously upload a file to me?

    - by mjn12
    I'm looking for some kind of web application I can host on my Linux web server that will allow users to upload files of arbitrary size to me from their browser without requiring them to log in. Ideally this application would let me generate a link to my website that allows a one-time upload. It might contain a unique, random key that is only good for that session: I could email them the link, they click it and are taken to a page where they can upload their file to me. I'm mainly targeting friends and family who need to send me files that are too large for email. I don't want to require them to install anything (Dropbox), sign up and log in, etc., and I'm definitely not teaching them to use FTP. This wouldn't be a difficult project for me to build on my own, but I'd like to take something off the shelf if possible. Does anything like this exist that my google-fu isn't turning up?
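
    If nothing off the shelf fits, the roll-your-own version the question describes is genuinely small. A minimal sketch of a one-time-token upload endpoint using Flask; the route names, token scheme and upload directory are assumptions, and it omits the size limits, authentication on the link-minting route and HTTPS you would want in practice:

        # One-time-use upload link: mint a token, email the link yourself,
        # and the token is invalidated as soon as a file arrives.
        import os
        import secrets
        from flask import Flask, request, abort

        app = Flask(__name__)
        UPLOAD_DIR = "/var/uploads"          # assumed destination directory
        valid_tokens = set()                 # in-memory; use a file/DB to survive restarts

        @app.route("/new-link")              # protect this route in real use
        def new_link():
            token = secrets.token_urlsafe(16)
            valid_tokens.add(token)
            return f"Send this to your friend: https://example.com/upload/{token}\n"

        @app.route("/upload/<token>", methods=["GET", "POST"])
        def upload(token):
            if token not in valid_tokens:
                abort(404)
            if request.method == "POST":
                f = request.files["file"]
                f.save(os.path.join(UPLOAD_DIR, os.path.basename(f.filename)))
                valid_tokens.discard(token)  # one-time use
                return "Thanks, got it.\n"
            return ('<form method="post" enctype="multipart/form-data">'
                    '<input type="file" name="file"><input type="submit"></form>')

        if __name__ == "__main__":
            app.run()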

    Read the article

  • PuTTY automatically supply password

    - by Kyle Cronin
    I have a situation where I need PuTTY (or another SSH client for Windows) to automatically log into another machine via SSH. I realize this isn't a good idea security-wise, but unfortunately I'm constrained by limitations on both the client and the server. The best solution would be a shortcut or script on the desktop that, when double-clicked, connects to the server and automatically logs in. Can I do this with PuTTY? I am willing to explore public-key authentication, but I'm not sure where the PuTTY key resides or how to copy it to the server, as the app starts automatically upon login.
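
    PuTTY accepts the connection details on the command line (including, for better or worse, a -pw password option), so the double-clickable version can be as simple as a shortcut whose target passes those arguments. The sketch below just does the same thing from a small Python launcher script; the install path, host, user and password are placeholder assumptions:

        # Launch PuTTY with a stored password. Storing a plain-text password is weak,
        # as the question acknowledges; prefer a PuTTYgen key pair if at all possible.
        import subprocess

        PUTTY = r"C:\Program Files\PuTTY\putty.exe"   # assumed install path
        HOST = "server.example.com"
        USER = "backupuser"
        PASSWORD = "correct horse battery staple"     # placeholder secret

        subprocess.run([PUTTY, "-ssh", f"{USER}@{HOST}", "-pw", PASSWORD], check=True)

    For the public-key route, the key does not live in a fixed PuTTY directory: you generate a .ppk file with PuTTYgen, point the saved session at it (Connection > SSH > Auth), and paste the public half into ~/.ssh/authorized_keys on the server.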

    Read the article

  • Slow WLAN file transfer between server and tablet

    - by user266985
    My file server is running Ubuntu 12.04 and I'm sharing files from it over Samba. It is connected via gigabit Ethernet. My desktop, running Windows 8.1, is also connected via gigabit Ethernet, and I can transfer files between the two and completely saturate that gigabit pipe. However, I just got a Surface Pro 2, and I'm trying to stream HD movies from my server to the device over WiFi. For some reason, I can't get much past 1.5 MB/s transferring files over the network. I've tried streaming through XBMC and a standard file copy; no difference. To add to the confusion, if I connect to my guest network and then use my VPN server (installed on the router) to access the file server, I get around 3.2 MB/s. I've been running diagnostics to determine the root cause and I think I've found it, but I have no idea what is causing it or how to fix it.

        Router: Asus RT-N66U
        Surface Pro 2 network card: Marvell Avastar 350N (driver 19/09/2013 v14.69.24044.150)
        inSSIDer: link score 100, co-channels 0, overlapping 0
        5 GHz network channel: 48+44

    iperf with the file server as server and the Surface Pro 2 as client - TCP performance: acceptable

        ------------------------------------------------------------
        Server listening on TCP port 5001
        TCP window size: 85.3 KByte (default)
        ------------------------------------------------------------
        [  4] local 192.168.0.90 port 5001 connected with 192.168.0.56 port 57367
        [ ID] Interval       Transfer     Bandwidth
        [  4]  0.0- 1.0 sec  10.1 MBytes  84.7 Mbits/sec
        [  4]  1.0- 2.0 sec  10.4 MBytes  87.6 Mbits/sec
        [  4]  2.0- 3.0 sec  10.6 MBytes  88.8 Mbits/sec
        [  4]  3.0- 4.0 sec  10.7 MBytes  89.5 Mbits/sec
        [  4]  4.0- 5.0 sec  10.1 MBytes  84.4 Mbits/sec
        [  4]  5.0- 6.0 sec  10.2 MBytes  85.8 Mbits/sec
        [  4]  6.0- 7.0 sec  7.04 MBytes  59.1 Mbits/sec
        [  4]  7.0- 8.0 sec  10.8 MBytes  90.2 Mbits/sec
        [  4]  8.0- 9.0 sec  10.6 MBytes  89.1 Mbits/sec
        [  4]  9.0-10.0 sec  8.62 MBytes  72.3 Mbits/sec
        [  4]  0.0-10.0 sec  99.2 MBytes  83.1 Mbits/sec

    iperf with the Surface Pro 2 as server and the file server as client - performance: poor

        ------------------------------------------------------------
        Client connecting to 192.168.0.56, TCP port 5001
        TCP window size: 22.9 KByte (default)
        ------------------------------------------------------------
        [  3] local 192.168.0.90 port 40233 connected with 192.168.0.56 port 5001
        [ ID] Interval       Transfer     Bandwidth
        [  3]  0.0- 1.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  1.0- 2.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  2.0- 3.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  3.0- 4.0 sec  1.25 MBytes  10.5 Mbits/sec
        [  3]  4.0- 5.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  5.0- 6.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  6.0- 7.0 sec  1.38 MBytes  11.5 Mbits/sec
        [  3]  7.0- 8.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  8.0- 9.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  9.0-10.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  0.0-10.1 sec  15.0 MBytes  12.4 Mbits/sec

    For some reason it gets capped in that direction and I haven't got a clue why. Any suggestions? Edit: My link speed is reported as 270 Mbps by Windows. I'm less than two metres from the router with a clear line of sight.

    Read the article

  • Looking for a 'WinHlp32.exe compatible' replacement for free redistribution under Vista and Windows 7

    - by richardboon
    Our software installs a package of legacy software for the client; some of it includes old .hlp files from a third-party vendor that require WinHlp32.exe (note: we have no legal right to modify the .hlp files). Those clients may only have CD/DVD and might not have internet access, etc., so I need a free 'WinHlp32.exe compatible' replacement that we can redistribute under Vista and Windows 7. Background of the problem:

    - Microsoft stopped including the 32-bit Help file viewer in Windows releases beginning with Windows Vista and Windows Server 2008.
    - Starting with the release of Windows Vista and Windows Server 2008, third-party software developers are no longer authorized to redistribute WinHlp32.exe with their programs.

    http://support.microsoft.com/kb/917607

    Read the article

  • How to make sure clients update their browser cache when my website is updated?

    - by user64204
    I am using the HTTP 1.1 Cache-Control header to implement client-side caching. Since I update my website only once a month, I would like the CSS and JS files to be cached for 30 days with Cache-Control: max-age=2592000. The problem is that the 30-day period defined by Cache-Control doesn't coincide with the website update cycle: it starts the moment a user visits the site and ends 30 days later, which means an update could occur in the meantime and users would be running with outdated content for a while. That could break the rendering of the website if, for instance, the HTML and CSS no longer match. How can I cache content client-side for periods of several days yet somehow get users to refresh their CSS/JS files after the website has been updated? One solution I can think of is that, if website updates can be scheduled, the max-age returned by the server could be decreased each day accordingly, so that no matter when people visit the website, the end of the caching period would coincide with the update of the website. But changing the server configuration every day goes against one of my sysadmin principles (once it's running, don't touch it).
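
    The usual way around this is not to shorten max-age but to change the URL of the asset whenever its content changes (a version or content-hash query string), so the long-lived cache entry is simply never consulted after a release. A minimal sketch of generating such fingerprinted links at page-render time; the directory layout and helper name are assumptions:

        # Emit CSS/JS URLs that include a short content hash, so a 30-day
        # Cache-Control max-age can stay in place and updated files still get fetched.
        import hashlib
        import os

        STATIC_DIR = "static"                 # assumed location of css/js on disk

        def asset_url(relative_path: str) -> str:
            """Return e.g. 'static/site.css?v=1a2b3c4d' based on the file's content."""
            with open(os.path.join(STATIC_DIR, relative_path), "rb") as fh:
                digest = hashlib.md5(fh.read()).hexdigest()[:8]
            return f"{STATIC_DIR}/{relative_path}?v={digest}"

        if __name__ == "__main__":
            # Would be called from the HTML template, e.g.:
            #   <link rel="stylesheet" href="{{ asset_url('site.css') }}">
            print(asset_url("site.css"))

    Since the HTML itself is what references the new URLs, it should be served with a much shorter max-age (or no-cache) so the switch takes effect as soon as the page is reloaded.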

    Read the article
