Search Results

Search found 33182 results on 1328 pages for 'linux port'.


  • How can I set my linux box as a router to forward ip packets?

    - by UniMouS
    I am doing a network experiment on IP packet forwarding, but I can't figure out why it doesn't work. I have a Linux machine with two network interfaces, eth0 and eth1, both with static IP addresses (eth0: 192.168.100.1, eth1: 192.168.101.2). My goal is simple: I want to forward IP packets arriving on eth1 with a destination in subnet 192.168.100.0/24 out of eth0, and packets arriving on eth0 with a destination in subnet 192.168.101.0/24 out of eth1. I turned on IP forwarding with:

        sysctl -w net.ipv4.ip_forward=1

    My routing table looks like this:

        # route -n
        Kernel IP routing table
        Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
        192.168.100.0   0.0.0.0         255.255.255.0   U     0      0        0 eth0
        192.168.101.0   0.0.0.0         255.255.255.0   U     0      0        0 eth1

    But when I try to ping from 192.168.100.25 to 192.168.101.47, it does not work.
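
    Not part of the question, but a minimal checklist sketch of what typically has to be in place for this setup, assuming the two end hosts are supposed to use the Linux box (192.168.100.1 / 192.168.101.2) as their gateway; the addresses come from the question and the commands are the stock sysctl/iptables/ip tools:

        # On the Linux box: enable forwarding now and persist it across reboots
        sysctl -w net.ipv4.ip_forward=1
        echo "net.ipv4.ip_forward = 1" >> /etc/sysctl.conf

        # Make sure nothing in the FORWARD chain drops the traffic
        iptables -L FORWARD -n -v

        # On 192.168.100.25: route the other subnet via the router's eth0 address
        ip route add 192.168.101.0/24 via 192.168.100.1

        # On 192.168.101.47: route the other subnet via the router's eth1 address
        ip route add 192.168.100.0/24 via 192.168.101.2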

    Read the article

  • Developer Preview of JDK8, JavaFX8 *HARD-FLOAT ABI* for Linux/ARM Now Available!

    - by HecklerMark
    Just a quick post to spread the good word: the Developer Preview of JDK8 and JavaFX8 for Linux on ARM processors - hard-float ABI - is now available here. Right here. It's been tested on the Raspberry Pi, and many of us plan to (unofficially) test it on a variety of other ARM platforms. This could be the beginning of something big. So...what are you still doing here? Go download it already! (Did I mention you could get it here?) :-D All the best,Mark

    Read the article

  • PHP 5.2 and 5.3: a strange bug makes Denial of Service attacks child's play on Windows and Linux

    PHP: a strange bug reportedly makes Denial of Service attacks child's play. It affects versions 5.2 and 5.3 of the language on Windows and Linux. A critical bug has just been discovered in the 5.2 and 5.3 branches of PHP, one of the most popular web programming languages. The bug is triggered by certain floating-point values with a very large number of decimal places. Computing or evaluating them in PHP causes an infinite loop that consumes 100% of the CPU. Executing the following line of code, or even its equivalent without scientific notation (with 324 decimal places), would therefore hang the machine, whether under...

    Read the article

  • Can Ubuntu be installed in a subdirectory of another Linux variant?

    - by Reid
    I have access to (but not root on) a compute server which is running a Linux distribution that is a few years old. I'd much prefer to use a current Debian-like flavor. Thus, I'm wondering if it is possible to install Ubuntu (or stock Debian) in one of my directories, and use the Ubuntu programs and libraries in preference to what comes with the server. I would need to access arbitrary parts of the server's filesystem, not just the parts under the Ubuntu install. I log in by SSH, so there's no desktop environment needed. But, I would like to be able to use X programs.
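
    One hedged approach (not mentioned in the question) is to unpack an Ubuntu or Debian root filesystem into a directory and enter it with a user-space chroot such as proot, which needs no root privileges; the directory and tarball names below are only examples:

        # Unpack an Ubuntu base/core rootfs tarball (available from cdimage.ubuntu.com) into $HOME
        mkdir -p ~/ubuntu-rootfs
        tar -xzf ubuntu-base-amd64.tar.gz -C ~/ubuntu-rootfs    # tarball name is illustrative

        # Enter it with proot; -R binds /home, /proc, /dev and friends, so the host filesystem stays visible
        proot -R ~/ubuntu-rootfs /bin/bash

        # Inside the guest, apt-get can install packages without being root on the host
        apt-get update && apt-get install x11-apps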

    Read the article

  • Ubuntu Linux updates don't update to the latest version, why?

    - by djahma
    I'm using Ubuntu, and every now and then new updates are offered for installation. However, I've noticed these updates don't necessarily offer the latest version of any one package. Why? For example, today under Ubuntu 11.04 I'm being offered an update of the Linux kernel from version 2.6.38.12 to 2.6.38.13, but on my secondary computer running Xubuntu 11.10 the kernel version is already 3.0!? Other examples are the nautilus package, which remains at version 2.32 in Ubuntu 11.04 but is already at 3.2.1 on Xubuntu 11.10, or the mono JIT compiler sticking to 2.6.7 in 11.04 but already at 2.10.5 in 11.10. So, does anyone have a clue as to why these updates are updating my system to old versions?
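
    The short explanation is that a given Ubuntu release keeps the versions it shipped with and only receives bug and security updates within that series; newer upstream versions (kernel 3.0, nautilus 3.x) arrive with the next release, not through the updater. A hedged sketch of how to confirm this on 11.04, using standard apt commands:

        # Show which versions of the kernel meta-package the configured archives will ever offer
        apt-cache policy linux-image-generic

        # List every version of nautilus available in the 11.04 pockets (release, -updates, -security)
        apt-cache madison nautilus

        # A bigger jump (2.6.38 -> 3.0, nautilus 2.32 -> 3.x) comes only with a release upgrade
        sudo do-release-upgrade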

    Read the article

  • Does Parallels Plesk Panel 11 have a free built-in firewall for the Debian 6 Linux OS? [closed]

    - by Sampath
    Possible Duplicate: How strong is parallels plesk 11 inbuilt firewall for linux os? I am new to Linux dedicated server hosting and Plesk Panel 11, and I am looking for a built-in firewall module. Does it come free with Plesk 11, do I need to pay for it, or does Plesk 11 not support a firewall at all? I looked at the demo at http://www.parallels.com/products/plesk/demos/, but I couldn't find any information about a firewall in Plesk 11.

    Read the article

  • How to correctly install another Linux flavour (in my case PCLinuxOS) alongside an existing Ubuntu 10.10 installation?

    - by Vincenzo
    Hello everybody, and a prosperous and productive year 2011! I have Ubuntu 10.10 (32-bit) installed on my laptop. I would like to install PCLinuxOS (the KDE or LXDE version, I don't know yet) on the same computer, alongside Ubuntu 10.10. I would like to test PCLinuxOS under real conditions, and also to look into my Audio CD playback issue (a DBus timeout error on mounting). I would be grateful if somebody could advise me on how to install another Linux flavour without breaking :) the existing Ubuntu system. Thank you in advance for your advice and recommendations. Here is my current partitioning:

    Read the article

  • Updating to the 3.0.0-18-generic Linux kernel is causing system instability - and no backwards compatibility

    - by Gaurav_Java
    I was prompted for an update to kernel 3.0.0-18-generic, so I upgraded and rebooted, and now my system is behaving really strangely: applications are crashing, the system restarts at random, and files randomly disappear from my desktop. I've tried booting into the 3.0.0-17-generic kernel, but that doesn't help either; on top of that, I cannot connect to the Internet when I use the 3.0.0-17-generic kernel. Are there any ideas on what may be wrong, and how can I go about debugging this?
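
    A hedged starting point for the debugging part, using the standard tools on an Ubuntu system of that era (package names and log paths are the usual ones):

        # Which kernels are installed, and which one is actually running?
        dpkg -l 'linux-image-*' | grep '^ii'
        uname -r

        # Look for oopses, segfaults and filesystem errors from the bad sessions
        dmesg | grep -iE 'error|oops|segfault' | tail -n 20
        grep -iE 'error|segfault' /var/log/syslog | tail -n 20

        # Out-of-tree drivers (wireless, graphics) must be rebuilt for each new kernel
        dkms status

        # Rule out disk corruption caused by the random restarts: force a check on the next boot
        sudo touch /forcefsck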

    Read the article

  • Is there ongoing work in the kernel team to improve battery life under linux?

    - by leousa
    I have read in some forums that the kernel team is working on improving battery life and energy efficiency in Linux. Unfortunately our community really lags behind Windows and Mac in that regard. I would like to read about the reasons why this difference with other platforms exists. Is it purely due to closed hardware specs from vendors, or does it have to do with kernel design issues? Apple devices with Unix cores have amazing battery life, but they also design their own hardware... I just want to understand these issues in a less technical way. I know that recent kernel updates in Ubuntu have improved battery life on most computers, but I was wondering if there is still development going on and where I can read more about it. Thanks in advance
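
    Not an answer about upstream kernel development itself, but for seeing where the power goes on one particular machine a commonly suggested starting point is PowerTOP; a hedged sketch, where the sysfs path is only an example and varies by hardware:

        # PowerTOP reports per-device and per-process wakeups and lists tunable settings
        sudo apt-get install powertop
        sudo powertop

        # Many of its suggestions map to kernel/sysfs knobs, e.g. SATA link power management
        echo min_power | sudo tee /sys/class/scsi_host/host0/link_power_management_policy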

    Read the article

  • Windows Azure: Microsoft and Oracle bury the hatchet with an agreement covering Java, the database, Linux and Oracle's applications

    Windows Azure: Microsoft and Oracle bury the hatchet with an agreement covering Java, the database, Linux, the middleware and Oracle's applications. Who would have thought it? The long-time rivals Microsoft and Oracle have just struck a deal, whose effects will roll out in several stages. As of today, Oracle-certified applications can run on Hyper-V (the virtualization tool in Windows Server) and be hosted on Windows Azure. Conversely, Microsoft's cloud platform becomes an Oracle-approved reseller of cloud services (the only one besides AWS). But ...

    Read the article

  • Linux Mint 12 final release: GNOME 3 with extensions, MATE, and DuckDuckGo as the new default search engine

    Linux Mint 12 is out in its final version: GNOME 3 with extensions or MATE, and DuckDuckGo as its new default search engine. Mint, the distro of the moment, has just been officially released as version 12, "Lisa". The official announcement confirms the availability of this edition, which had already been on several mirror sites for a few days. Mint ships as soon as it is ready and has no fixed release calendar, unlike the Ubuntu it derives from. Before any technical considerations, this launch is marked by a strategic partnership. The...

    Read the article

  • Is there a legitimate reason why the Outlook.com premium UI is "not available" in Chrome on Linux?

    - by vgaldikas
    Well, if you use Outlook OWA in Chrome on Ubuntu (or any Linux distro), you basically get a stripped-down version of it. You can get around this by faking your user agent so that you appear to be using Firefox. So my question is: is there some legitimate reason Microsoft does this, or are they just being a****s? I mean, once the user agent is faked, it works perfectly. PS: Just in case anyone else needs to use Outlook, here is the command to start Chrome with a fake user agent:

        /opt/google/chrome/google-chrome --user-agent="Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.4 (KHTML, like Gecko) Chrome/6.0.481.0 Safari/534.4"

    Read the article

  • Linux (Ubuntu 12.10) starts up, but the main menu on the left side, and the pictograms at the upper right corner, do not show up

    - by user114220
    The Linux OS (Ubuntu 12.10) starts up (user login, my personal picture of a beautiful Norwegian fjord landscape fills the background), but the main menu on the left side, and the pictograms (battery state etc.) in the upper right corner of the screen, do not show up. This occurred quite suddenly yesterday, after six or seven weeks of working with Ubuntu 12.10. I then started in recovery mode and performed a file system check (fsck), but that took hours, and finally a message "Sorry, Ubuntu has experienced an internal error" etc. appeared. How can I determine what has gone wrong and, if possible, how can I solve it, so that I can work again using Ubuntu 12.10 (on a dual-boot system, Intel i7, ASUS P8P67 motherboard)?
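
    The symptoms (wallpaper loads but no launcher or indicators) usually point at Unity/Compiz failing to start rather than a disk problem. A hedged set of commands often suggested for 12.10, run from a terminal inside the session (Ctrl+Alt+T) or a virtual console; the package names and dconf path are the standard Ubuntu ones:

        # Look for crash reports and for compiz/unity errors from the last session
        ls /var/crash/
        grep -iE 'compiz|unity' ~/.xsession-errors | tail

        # Reinstall the desktop packages in case an interrupted update left them half-configured
        sudo apt-get update
        sudo apt-get install --reinstall ubuntu-desktop unity

        # Reset this user's Unity/Compiz settings (stored in dconf on 12.10) and restart the shell
        dconf reset -f /org/compiz/
        setsid unity --replace &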

    Read the article

  • Does anyone know of a program like Elastics for Linux?

    - by coteyr
    On Mac I would use Elastics (http://tundrabot.com/elastics) to monitor my EC2 instances. Most important to me was that it worked across several accounts and that it allowed me to connect to a running instance via SSH or RDP; that is the only real feature I care about. I cannot simply keep the IP address of a running server, because EC2 is elastic and the address changes frequently. Is there any tool like Elastics for Linux/Unity? Again, it should support several accounts and be able to give me the IP addresses of the instances I select.
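
    I don't know of a direct Elastics clone, but the two requirements (several accounts, fetch the current address and connect) can be covered from the command line; a hedged sketch using the AWS CLI, where the profile name, instance ID, key file and user are all examples:

        # ~/.aws/credentials can hold one section per account, e.g. [work] and [personal]

        # List instance IDs, Name tags and public IPs for one account
        aws ec2 describe-instances --profile work \
            --query 'Reservations[].Instances[].[InstanceId,Tags[?Key==`Name`]|[0].Value,PublicIpAddress]' \
            --output table

        # Grab the current public IP of one instance and ssh straight to it
        IP=$(aws ec2 describe-instances --profile work --instance-ids i-0123456789abcdef0 \
            --query 'Reservations[0].Instances[0].PublicIpAddress' --output text)
        ssh -i ~/.ssh/work.pem ubuntu@"$IP"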

    Read the article

  • How to add authentication to ssh dynamic port forwarding?

    - by Aalex Gabi
    I am using ssh as a SOCKS server by running this command on the server:

        ssh -f2qTnND *:1080 root@localhost

    There is one problem: anybody can connect to the server and use its internet connection. Options:
    - Use iptables to filter access to the server. But I connect to the server from various non-statically allocated IP addresses, so I would have to edit those filters very frequently, which can be annoying.
    - Install a SOCKS server on the remote machine. Ultimately this is the last resort if there is no simpler way to do it. (I am very lazy.)
    - Launch the same command on the client machines. The problem here is that some clients don't run Linux and it is awkward to set up the tunnel (Windows + PuTTY).

    Is there a way to add authentication to a SOCKS server made using ssh? Bonus question: how do I add encryption between the client and the server (made using ssh)?
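
    One common pattern (not from the question itself) is to stop exposing port 1080 publicly at all and let every client build its own SOCKS tunnel with ssh -D; SSH then provides both the authentication and the encryption asked about. A minimal sketch, with server.example.com and the user name as placeholders:

        # On each client: -D opens a local SOCKS5 proxy that is forwarded through the SSH session,
        # so the connection is authenticated (key or password) and encrypted end to end.
        ssh -N -D 127.0.0.1:1080 user@server.example.com
        # ...then point the browser/application at SOCKS5 proxy 127.0.0.1:1080

        # If a server-side listener is still wanted, bind it to loopback only so nothing is public:
        ssh -f -N -D 127.0.0.1:1080 root@localhost

        # Windows clients can do the same with PuTTY: Connection -> SSH -> Tunnels,
        # source port 1080, destination type "Dynamic".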

    Read the article

  • Does anyone know of a Nagios plugin that uses nmap and does port checking?

    - by Eedoh
    Hi to all. I need to monitor open and closed ports on dozens of hosts. I've found a Nagios plugin that does what I need, but I would have to run that script through NRPE. Some of the hosts run Linux and they all have Perl installed, but some of them are Windows machines, and it's not convenient for me to install Perl on every one of them. That's why I cannot use this plugin. I hope there's a Nagios plugin that uses nmap, or something similar, so it can check ports on every host remotely, without installing plugins on the remote hosts, only on the server.
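
    For the plain open/closed case, Nagios' stock check_tcp plugin already runs entirely from the Nagios server with no agent on the monitored hosts. If nmap output is specifically wanted, a tiny server-side wrapper plugin could look like the hedged sketch below (check_nmap_port and its argument convention are hypothetical, not an existing plugin):

        #!/bin/bash
        # Hypothetical minimal Nagios plugin, run only on the Nagios server.
        # Usage: check_nmap_port <host> <port> <open|closed>
        HOST=$1; PORT=$2; EXPECT=$3

        # Ask nmap for the state of that single TCP port (open/closed/filtered)
        STATE=$(nmap -Pn -p "$PORT" "$HOST" 2>/dev/null | awk -v p="$PORT/tcp" '$1 == p {print $2}')

        if [ "$STATE" = "$EXPECT" ]; then
            echo "OK - port $PORT on $HOST is $STATE"
            exit 0
        else
            echo "CRITICAL - port $PORT on $HOST is ${STATE:-unknown}, expected $EXPECT"
            exit 2
        fi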

    Read the article

  • How can a hacker access the content of a CentOS 6 VPS?

    - by user2118559
    I just want to understand this. Please correct my mistakes and offer advice. A hacker can get access to a VPS:

    1. Through the console terminal, for example using PuTTY. To get access, the hacker needs to know the port number, the username and the password. The port number can be discovered by scanning for open ports and then trying to log in. As I understand it, the only way to log in is to know the username and password. To block port scanning (or at least make it more difficult), iptables must be used, configured in /etc/sysconfig/iptables. I followed this tutorial, https://www.digitalocean.com/community/articles/how-to-setup-a-basic-ip-tables-configuration-on-centos-6, and ended up with:

        *nat
        :PREROUTING ACCEPT [87:4524]
        :POSTROUTING ACCEPT [77:4713]
        :OUTPUT ACCEPT [77:4713]
        COMMIT
        *mangle
        :PREROUTING ACCEPT [2358:200388]
        :INPUT ACCEPT [2358:200388]
        :FORWARD ACCEPT [0:0]
        :OUTPUT ACCEPT [2638:477779]
        :POSTROUTING ACCEPT [2638:477779]
        COMMIT
        *filter
        :INPUT DROP [1:40]
        :FORWARD ACCEPT [0:0]
        :OUTPUT ACCEPT [339:56132]
        -A INPUT -m state --state RELATED,ESTABLISHED -j ACCEPT
        -A INPUT -p tcp -m tcp --tcp-flags FIN,SYN,RST,PSH,ACK,URG NONE -j DROP
        -A INPUT -p tcp -m tcp ! --tcp-flags FIN,SYN,RST,ACK SYN -m state --state NEW -j DROP
        -A INPUT -p tcp -m tcp --tcp-flags FIN,SYN,RST,PSH,ACK,URG FIN,SYN,RST,PSH,ACK,URG -j DROP
        -A INPUT -i lo -j ACCEPT
        -A INPUT -p tcp -m tcp --dport 80 -j ACCEPT
        -A INPUT -p tcp -m tcp --dport 110 -j ACCEPT
        -A INPUT -p tcp -m tcp --dport 22 -j ACCEPT
        -A INPUT -s 11.111.11.111/32 -p tcp -m tcp --dport 22 -j ACCEPT
        -A INPUT -p tcp -m tcp --dport 21 -j ACCEPT
        -A INPUT -s 11.111.11.111/32 -p tcp -m tcp --dport 21 -j ACCEPT
        COMMIT

    Regarding the ports that need to be open: if SSL is not used, it seems port 80 must stay open for the website, plus SSH (default 22) and FTP (default 21). I also restricted the IP address that is allowed to connect, so if a hacker uses another IP address he cannot get in even if he knows the username and password? Regarding email I am not sure. If I send email using Gmail ("Send mail as:" to send from my other email addresses), then port 25 is not necessary. For incoming email at dynadot.com I use email forwarding. Does that mean the emails "never arrive at the VPS" (before reaching the VPS they are forwarded, for example to Gmail)? If emails do not arrive at the VPS, then it seems port 110 is not necessary either. If only SSL is used, port 443 must be open and port 80 closed. I do not understand port 3306. In PuTTY, /bin/netstat -lnp shows:

        Proto Recv-Q Send-Q Local Address      Foreign Address    State    PID/Program name
        tcp        0      0 0.0.0.0:3306       0.0.0.0:*          LISTEN   992/mysqld

    As I understand it, this is MySQL, but I do not remember opening such a port (maybe it was opened automatically when MySQL was installed?). MySQL is installed on the same server as all the other content. I need to understand port 3306.

    2. A hacker may also be able to reach the console terminal through the VPS hosting provider's control panel (serial console emergency access). As I understand it, only the console terminal (PuTTY, etc.) allows "global" changes (changes that cannot be made over FTP).

    3. A hacker can get into my VPS by exploiting a hole in my PHP code and uploading, for example, a trojan. Unfortunately, I have faced the situation where my VPS was hacked. As I understand it, this happened because I used ZPanel. On the VPS (in /etc/zpanel/panel/bin) I found a PHP file that was identified as a trojan by some virus scanners (at virustotal.com). I experimented with the file on a local computer (WAMP), and it turns out the attacker could see all the content of the VPS, rename, delete, upload, etc.
    In my opinion, if in PuTTY I run a command like chattr +i /etc/php.ini, then the hacker would not be able to modify php.ini. Is there any other way to get into a VPS?
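
    Not asked for directly, but since the console/SSH entry point is the first item above, the usual hardening steps are worth a sketch. This is a hedged example for CentOS 6; the port number is illustrative and fail2ban comes from the EPEL repository:

        # /etc/ssh/sshd_config - stop password guessing and keep root off SSH:
        #   PermitRootLogin no
        #   PasswordAuthentication no      # key-based logins only
        #   Port 2222                      # optional: move off 22 to cut scanner noise

        # Apply the sshd changes
        service sshd restart

        # fail2ban (from EPEL) bans IPs that repeatedly fail authentication (it watches /var/log/secure)
        yum install fail2ban
        service fail2ban start && chkconfig fail2ban on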

    Read the article

  • Is there a Windows 7 compatible IPSec VPN client that allows protocol and port specific rules?

    - by Sani Huttunen
    As the title says, I need to find an IPSec VPN client for Windows 7. On XP and Vista we've used SafeNet SoftRemote, in which you can set up rules for specific protocols and ports. But SoftRemote isn't compatible with Windows 7.

        172.xxx.xxx.1  TCP 1433
        172.xxx.xxx.2  TCP 1433
        172.xxx.xxx.10 ALL
        ...

    Since the VPN gateway is configured this way, the client must mirror these settings. I've tried TheGreenBow, NCP Secure Entry, Cisco VPN Client and Shrew Soft VPN, but none of these allows you to configure by protocol and port. Does anyone have any other suggestions? EDIT: Forgot to mention that aggressive mode is also a requirement. --UPDATE-- I've got some news... I've managed to get SoftRemote to work on Windows 7 x64 through Windows XP Mode. After scouring all corners of the Internet for ideas, I had enough information to construct a working solution. This solution will probably benefit other clients as well! You'll find a post here with detailed instructions on how I went about it.

    Read the article

  • Why does Windows (file) explorer try to connect to port 80 (http) instead of just using SMB?

    - by Erik
    Background: on an almost freshly installed PC I get a message along the lines of "Windows cannot find some-file-server-name. Check the spelling and try again"... when trying to access any file share.

    Troubleshooting so far:
    - pinging works, both by IP and by name
    - the almost identical PC next to this one can access the file server
    - everyone else can access the file server
    - the PC in question cannot access other open file shares, but it can connect to the internet

    And now for what I think is the interesting part: running Wireshark with ip.addr == local.ip.add.ress and ip.addr == server.ip.add.ress tells me that it tries to connect over HTTP. The server replies, but after a few messages back and forth it stops. The other machine, of course, just uses SMB. I guess port 80 just means it defaults to WebDAV, but I haven't been able to find anything that could cause this. Googling it, the closest thing I found was http://www.techrepublic.com/article/get-vista-and-samba-to-work/6353849, but then again that was an XP PC and I wasn't able to connect to other native Windows shares (and I tried the solution anyway and it didn't work).

    Read the article

  • Node.js Adventure - Host Node.js on Windows Azure Worker Role

    - by Shaun
    In my previous post I demonstrated how to develop and deploy a Node.js application on Windows Azure Web Sites (a.k.a. WAWS). WAWS is a new feature of the Windows Azure platform. It is low-cost, and it provides the IIS and IISNode components so that we can host our Node.js application through Git, FTP and WebMatrix without any configuration or component installation. But sometimes we need to use a Windows Azure Cloud Service (a.k.a. WACS) and host our Node.js application in a worker role. Below are some benefits of using a worker role:

    - WAWS leverages IIS and IISNode to host the Node.js application, which runs in x86 WOW mode. This reduces performance compared with x64 in some cases.
    - A WACS worker role does not need IIS, hence there are no IIS restrictions, such as the 8000 concurrent requests limitation.
    - WACS gives developers more flexibility and control. For example, we can RDP to the virtual machines of our worker role instances.
    - WACS provides service configuration features which can be changed while the role is running.
    - WACS provides more scaling capability than WAWS. In WAWS we can have at most 3 reserved instances per web site, while in WACS we can have up to 20 instances in a subscription.
    - Since with a WACS worker role we start Node.js ourselves in a process, we can control the input, output and error streams. We can also control the version of Node.js.

    Run Node.js in Worker Role

    Node.js can be started by just having its executable file. This means that in Windows Azure we can have a worker role containing "node.exe" and the Node.js source files, then start it in the Run method of the worker role entry class. Let's create a new Windows Azure project in Visual Studio and add a new worker role. Since we need our worker role to execute "node.exe" with our application code, we need to add "node.exe" to our project. Right-click on the worker role project and add an existing item. By default Node.js is installed in the "Program Files\nodejs" folder, so we can navigate there and add "node.exe". Then we need to create the entry code for Node.js. In WAWS the entry file must be named "server.js", because it is hosted by IIS and IISNode, and IISNode only accepts "server.js". But here, since we control everything, we can choose any file as the entry code. For example, I created a new JavaScript file named "index.js" in the project root. Since we created a C# Windows Azure project we cannot create a JavaScript file from the "Add new item" context menu; we have to create a text file and then rename it to the JavaScript extension. After adding these two files we should set their "Copy to Output Directory" property to "Copy Always" or "Copy if Newer"; otherwise they will not be included in the package when deployed. Let's paste a very simple piece of Node.js code into "index.js", as below. As you can see, I created a web server listening on port 12345.

        var http = require("http");
        var port = 12345;

        http.createServer(function (req, res) {
            res.writeHead(200, { "Content-Type": "text/plain" });
            res.end("Hello World\n");
        }).listen(port);

        console.log("Server running at port %d", port);

    Then we need to start "node.exe" with this file when our worker role is started. This can be done in its Run method: I find the Node.js executable and the entry JavaScript file name, and then create a new process to run it. Our worker role will wait for the process to exit.
    If everything is OK, once our web server has opened the process will stay there listening for incoming requests and should not terminate. The code in the worker role would be like this:

        public override void Run()
        {
            // This is a sample worker implementation. Replace with your logic.
            Trace.WriteLine("NodejsHost entry point called", "Information");

            // retrieve the node.exe and entry node.js source code file name.
            var node = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot\node.exe");
            var js = "index.js";

            // prepare the process starting of node.exe
            var info = new ProcessStartInfo(node, js)
            {
                CreateNoWindow = false,
                ErrorDialog = true,
                WindowStyle = ProcessWindowStyle.Normal,
                UseShellExecute = false,
                WorkingDirectory = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot")
            };
            Trace.WriteLine(string.Format("{0} {1}", node, js), "Information");

            // start the node.exe with entry code and wait for exit
            var process = Process.Start(info);
            process.WaitForExit();
        }

    Then we can run it locally. In the compute emulator UI the worker role starts and executes Node.js, and the Node.js window appears. Open the browser to verify the website hosted by our worker role. Next, let's deploy it to Azure. But we need some additional steps. First, we need to create an input endpoint. By default there is no endpoint defined in a worker role, so we open the role property window in Visual Studio and create a new input TCP endpoint on the port we want our website to use; in this case I will use 80. Even though we created a web server, we should add a TCP endpoint to the worker role, since Node.js always listens on TCP rather than HTTP. Then change "index.js" so that our web server listens on 80:

        var http = require("http");
        var port = 80;

        http.createServer(function (req, res) {
            res.writeHead(200, { "Content-Type": "text/plain" });
            res.end("Hello World\n");
        }).listen(port);

        console.log("Server running at port %d", port);

    Then publish it to Windows Azure, and in the browser we can see our Node.js website running on a WACS worker role. We may encounter an error if we try to run our Node.js website on port 80 in the local emulator. This is because the compute emulator registers 80 and maps the 80 endpoint to 81, but Node.js cannot detect this, so when it tries to listen on 80 it fails since 80 is already in use.

    Use NPM Modules

    When we use WAWS to host Node.js, we can simply install the modules we need and then publish or upload all files to WAWS. But if we are using a WACS worker role, we have to take some extra steps to make the modules work. Assume that we plan to use "express" in our application. First of all, we download and install this module through the NPM command. After the install finishes, the files are on disk but not included in the worker role project; if we deployed the worker role right now, the module would not be packaged and uploaded to Azure. Hence we need to add them to the project. In the Solution Explorer window click the "Show all files" button, select the "node_modules" folder and, in the context menu, select "Include In Project". But that is not enough. We also need to set all files in this module to "Copy always" or "Copy if newer" so that they are uploaded to Azure along with "node.exe" and "index.js". This is a painful step, since there may be many files in a module.
    So I created a small tool that updates a C# project file and marks all its items as "Copy always". The code is very simple:

        static void Main(string[] args)
        {
            if (args.Length < 1)
            {
                Console.WriteLine("Usage: copyallalways [project file]");
                return;
            }

            var proj = args[0];
            File.Copy(proj, string.Format("{0}.bak", proj));

            var xml = new XmlDocument();
            xml.Load(proj);
            var nsManager = new XmlNamespaceManager(xml.NameTable);
            nsManager.AddNamespace("pf", "http://schemas.microsoft.com/developer/msbuild/2003");

            // add the output setting to copy always
            var contentNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:Content", nsManager);
            UpdateNodes(contentNodes, xml, nsManager);
            var noneNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:None", nsManager);
            UpdateNodes(noneNodes, xml, nsManager);
            xml.Save(proj);

            // remove the namespace attributes
            var content = xml.InnerXml.Replace("<CopyToOutputDirectory xmlns=\"\">", "<CopyToOutputDirectory>");
            xml.LoadXml(content);
            xml.Save(proj);
        }

        static void UpdateNodes(XmlNodeList nodes, XmlDocument xml, XmlNamespaceManager nsManager)
        {
            foreach (XmlNode node in nodes)
            {
                var copyToOutputDirectoryNode = node.SelectSingleNode("pf:CopyToOutputDirectory", nsManager);
                if (copyToOutputDirectoryNode == null)
                {
                    var n = xml.CreateNode(XmlNodeType.Element, "CopyToOutputDirectory", null);
                    n.InnerText = "Always";
                    node.AppendChild(n);
                }
                else
                {
                    if (string.Compare(copyToOutputDirectoryNode.InnerText, "Always", true) != 0)
                    {
                        copyToOutputDirectoryNode.InnerText = "Always";
                    }
                }
            }
        }

    Please be careful when using this tool. I created it only for this demo, so do not use it directly in a production environment. Unload the worker role project, execute this tool with the worker role project file name as the command-line argument, and it will set all items to "Copy always". Then reload the worker role project. Now let's change "index.js" to use express:

        var express = require("express");
        var app = express();

        var port = 80;

        app.configure(function () {
        });

        app.get("/", function (req, res) {
            res.send("Hello Node.js!");
        });

        app.get("/User/:id", function (req, res) {
            var id = req.params.id;
            res.json({
                "id": id,
                "name": "user " + id,
                "company": "IGT"
            });
        });

        app.listen(port);

    Finally, let's publish it and have a look in the browser.

    Use Windows Azure SQL Database

    We can use Windows Azure SQL Database (a.k.a. WASD) from Node.js on worker role hosting as well. Since we can control the version of Node.js, here we can use the x64 version of "node-sqlserver". This is better than hosting Node.js on WAWS, which only supports x86. Just install the "node-sqlserver" module from NPM, copy "sqlserver.node" from the "Build\Release" folder to the "Lib" folder, include them in the worker role project and run my tool to mark them as "Copy always". Finally, update "index.js" to use WASD:
        var express = require("express");
        var sql = require("node-sqlserver");

        var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:{SERVER NAME}.database.windows.net,1433;Database={DATABASE NAME};Uid={LOGIN}@{SERVER NAME};Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
        var port = 80;

        var app = express();

        app.configure(function () {
            app.use(express.bodyParser());
        });

        app.get("/", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.get("/text/:key/:culture", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var key = req.params.key;
                    var culture = req.params.culture;
                    var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.get("/sproc/:key/:culture", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var key = req.params.key;
                    var culture = req.params.culture;
                    var command = "EXEC GetItem '" + key + "', '" + culture + "'";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.post("/new", function (req, res) {
            var key = req.body.key;
            var culture = req.body.culture;
            var val = req.body.val;

            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.send(200, "Inserted Successful");
                        }
                    });
                }
            });
        });

        app.listen(port);

    Publish to Azure, and now we can see our Node.js application working with WASD through the x64 version of "node-sqlserver".

    Summary

    In this post I demonstrated how to host Node.js in a Windows Azure Cloud Service worker role. By using a worker role we can control the version of Node.js as well as the entry code, and it's possible to do some preparatory work before the Node.js application starts. It also removes the IIS and IISNode limitations. I personally recommend using a worker role for Node.js hosting. But there are some problems with the approach I described here. The first is that we need to set all JavaScript and module files to "Copy always" or "Copy if newer" manually. The second is that, this way, we cannot retrieve the cloud service configuration information.
    For example, we defined the endpoint in the worker role properties, but we also hard-coded the listening port in Node.js. Ideally Node.js should retrieve the endpoint itself, but I can tell you that won't work with this approach. In the next post I will describe another way to execute "node.exe" and the Node.js application, so that we can get the cloud service configuration in Node.js. I will also demonstrate how to use Windows Azure Storage from Node.js using the Windows Azure Node.js SDK.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Why can't I build Deluge?

    - by hugemeow
    Deluge is a BitTorrent client. I am trying to build it from source, since I don't have the privileges to install it as root. I am running python setup.py build, but it fails with the following message. Why?

        copying deluge/ui/web/themes/images/gray/slider/slider-v-thumb.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/gray/slider
        copying deluge/ui/web/themes/images/gray/slider/slider-thumb.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/gray/slider
        copying deluge/ui/web/themes/images/gray/panel/top-bottom.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/gray/panel
        copying deluge/ui/web/themes/images/gray/tabs/tab-strip-bg.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/gray/tabs
        copying deluge/ui/web/themes/images/yourtheme/window/right-corners.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/window
        copying deluge/ui/web/themes/images/yourtheme/window/left-corners.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/window
        copying deluge/ui/web/themes/images/yourtheme/window/left-right.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/window
        copying deluge/ui/web/themes/images/yourtheme/window/top-bottom.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/window
        creating build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/slider
        copying deluge/ui/web/themes/images/yourtheme/slider/slider-v-thumb.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/slider
        copying deluge/ui/web/themes/images/yourtheme/slider/slider-thumb.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/slider
        copying deluge/ui/web/themes/images/yourtheme/slider/slider-bg.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/slider
        copying deluge/ui/web/themes/images/yourtheme/slider/slider-v-bg.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/slider
        copying deluge/ui/web/themes/images/yourtheme/panel/top-bottom.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/panel
        copying deluge/ui/web/themes/images/yourtheme/grid/hmenu-lock.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/grid
        copying deluge/ui/web/themes/images/yourtheme/grid/hmenu-unlock.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/grid
        copying deluge/ui/web/themes/images/yourtheme/tabs/tab-strip-bg.png -> build/lib.linux-x86_64-2.4/deluge/ui/web/themes/images/yourtheme/tabs
        running build_ext
        building 'libtorrent' extension
        gcc -pthread -shared -L/usr/lib64 -L/opt/local/lib -lboost_filesystem -lboost_date_time -lboost_iostreams -lboost_python -lboost_thread -lpthread -lssl -lz -o build/lib.linux-x86_64-2.4/deluge/libtorrent.so
        /usr/bin/ld: cannot find -lboost_filesystem
        collect2: ld returned 1 exit status
        error: command 'gcc' failed with exit status 1
        [mirror@innov deluge-1.3.5]$ echo $?
        1

    Edit 1: gcc version and OS information

        $(which gcc) --version
        gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52)
        Copyright (C) 2006 Free Software Foundation, Inc.
        This is free software; see the source for copying conditions.  There is NO
        warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
        cat /etc/issue
        CentOS release 5.7 (Final)
        Kernel \r on an \m

    Edit 2: boost is referenced by setup.py in deluge

        if OS == "linux":
            if os.path.exists(os.path.join(sysconfig.get_config_vars()['LIBDIR'], \
                    'libboost_filesystem-mt.so')):
                boost_filesystem = "boost_filesystem-mt"
            elif os.path.exists(os.path.join(sysconfig.get_config_vars()['LIBDIR'], \
                    'libboost_filesystem.so')):
                boost_filesystem = "boost_filesystem"
            if os.path.exists(os.path.join(sysconfig.get_config_vars()['LIBDIR'], \
                    'libboost_date_time-mt.so')):
                boost_date_time = "boost_date_time-mt"
            elif os.path.exists(os.path.join(sysconfig.get_config_vars()['LIBDIR'], \
                    'libboost_date_time.so')):
                boost_date_time = "boost_date_time"
            if os.path.exists(os.path.join(sysconfig.get_config_vars()['LIBDIR'], \
                    'libboost_thread-mt.so')):
                boost_thread = "boost_thread-mt"
            elif os.path.exists(os.path.join(sysconfig.get_config_vars()['LIBDIR'], \
                    'libboost_thread.so')):
                boost_thread = "boost_thread"

            if 'boost_filesystem' not in vars():
                boost_filesystem = "boost_filesystem-mt"
            if 'boost_date_time' not in vars():
                boost_date_time = "boost_date_time-mt"
            if 'boost_thread' not in vars():
                boost_thread = "boost_thread-mt"

        elif OS == "freebsd":
            boost_filesystem = "boost_filesystem"
            boost_date_time = "boost_date_time"
            boost_thread = "boost_thread"
        else:
            boost_filesystem = "boost_filesystem-mt"
            boost_date_time = "boost_date_time-mt"
            boost_thread = "boost_thread-mt"

        librariestype = [boost_filesystem, boost_date_time,
            boost_thread, 'z', 'pthread', 'ssl', 'crypto']
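
    The linker message means it cannot find libboost_filesystem at link time. A hedged sketch of how to check for Boost and point the build at a copy installed under your home directory (the $HOME/boost prefix is only an example; without root you would first build and install Boost there yourself):

        # Does the system have the Boost shared libraries setup.py probes for?
        ls /usr/lib64/libboost_filesystem*.so /usr/lib/libboost_filesystem*.so 2>/dev/null

        # With root (or via the admin), the CentOS package that provides them is boost-devel:
        #   yum install boost-devel

        # Without root: after installing Boost under $HOME/boost, point the compiler and linker at it
        export CPPFLAGS="-I$HOME/boost/include"
        export LDFLAGS="-L$HOME/boost/lib"
        export LD_LIBRARY_PATH="$HOME/boost/lib:$LD_LIBRARY_PATH"
        python setup.py build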

    Read the article
