Search Results

Search found 7665 results on 307 pages for 'knockout js'.


  • Validating button click event using JS from inside ascx nested inside updatepanel

    - by Viswa
    Hello, I have a button inside an ascx, inside an update panel, inside an aspx content page. When the button is clicked I want it to run a JS function that shows a panel. Here is my code: <%@ Control Language="C#" AutoEventWireup="true" CodeBehind="ABC.ascx.cs" Inherits="App.ABC" %> <script type= "text/javascript" language="javascript"> var val1=0; var val2=0; function ShowPanel(val2) { if(val2 != 0) { switch(val2) { case 1 : document.getElementById('<%=pnl1.ClientID%>').style.visibility = 'visible'; break; } } return false; } </script> <asp:LinkButton ID="lbl1" runat="server" OnClick="return ShowPanel(1);">count</asp:LinkButton> I am not sure how to do this. Please help. Update #1 - ABC.ascx is inside an UpdatePanel in the aspx page XYZ.aspx: <%@ Control Language="C#" AutoEventWireup="true" CodeBehind="ABC.ascx.cs" Inherits="App.ABC" %> <script type= "text/javascript" language="javascript"> var val1=0; var val2=0; function ShowPanel(val2) { if (val2 != 0) { switch (val2) { case 1: document.getElementById("<%= this.pnl1.ClientID%>").style.display = "none"; break; } } return false; } </script> <div> <div style="text-align:center"> </div> <table style="width:100%; text-align:center; border-color:#99CCFF" border="3"> <tr style="text-align:left"> <td><asp:LinkButton ID="lbl1" runat="server" OnClientClick="return ShowPanel(1);">count</asp:LinkButton> </td> <td style="text-align:right"><asp:Button ID="btnHide1" runat="server" Text="hide" Height="18px" Width="32px"/> </td> </tr> <tr> <td colspan="2"><asp:Panel ID="pnl1" runat="server" Visible="false"> </asp:Panel> </td> </tr> </table> </div>
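
    A minimal client-side sketch of the idea (an addition, not from the question): wire the LinkButton through OnClientClick, and make sure the panel is actually rendered to the page (e.g. with style="display:none" rather than Visible="false", which omits the panel from the HTML entirely and makes getElementById return null).

        // Hedged sketch; the id passed in would be the '<%= pnl1.ClientID %>' value emitted by ASP.NET.
        function showPanel(panelClientId) {
            var panel = document.getElementById(panelClientId);
            if (panel) {
                panel.style.display = 'block'; // reveal the hidden panel
            }
            return false; // returning false from OnClientClick cancels the postback
        }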

    Read the article

  • Backbone Model fetched from Lithium controller is not loaded properly in bb Model

    - by Nilesh Kale
    I'm using backbone.js and Lithium. I'm fetching a model from the server by passing in a _id that is received as a hidden parameter on the page. The database MongoDB has stored the data correctly and can be viewed from console as: { "_id" : ObjectId("50bb82694fbe3de417000001"), "holiday_name" : "SHREE15", "description": "", "star_rating" : "3", "holiday_type" : "family", "rooms" : "1", "adults" : "2", "child" :"0", "emails" : "" } The Lithium Model class is so: class Holidays extends \lithium\data\Model { public $validates = array( 'holiday_name' => array( array( 'notEmpty', 'required' => true, 'message' => 'Please key-in a holiday name! (eg. Family trip for summer holidays)' ))); } The backbone Holiday model is so: window.app.IHoliday = Backbone.Model.extend({ urlRoot: HOLIDAY_URL, idAttribute: "_id", id: "_id", // Default attributes for the holiday. defaults: { }, // Ensure that each todo created has `title`. initialize: function(props) { }, The code for backbone/fetch is: var Holiday = new window.app.IHoliday({ _id: holiday_id }); Holiday.fetch( { success: function(){ alert('Holiday fetched:' + JSON.stringify(Holiday)); console.log('HOLIDAY Fetched: \n' + JSON.stringify(Holiday)); console.log('Holiday name:' + Holiday.get('holiday_name')); } } ); Lithium Controller Code is: public function load($holiday_id) { $Holiday = Holidays::find($holiday_id); return compact('Holiday'); } PROBLEM: The output of the backbone model fetched from server is as below and the Holiday model is not correctly 'formed' when data returns into backbone Model: HOLIDAY Fetched: {"_id":"50bb82694fbe3de417000001","Holiday":{"_id":"50bb82694fbe3de417000001","holiday_name":"SHREE15","description":"","star_rating":"3","holiday_type":"family","rooms":"1","adults":"2","child":"0","emails":""}} iplann...view.js (line 68) Holiday name:undefined Clearly there is some issue when the data is passed/translated from Lithium and loaded up as a model into backbone Holiday model. Is there something very obviously wrong in my code?
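
    A hedged sketch of one likely fix (an assumption, not taken from the question): since the Lithium action returns compact('Holiday'), the JSON arrives wrapped under a "Holiday" key, so Backbone stores a single "Holiday" attribute instead of the individual fields. Overriding parse() on the model unwraps the payload.

        // Assumes the server always wraps the document under a "Holiday" key.
        window.app.IHoliday = Backbone.Model.extend({
            urlRoot: HOLIDAY_URL,
            idAttribute: "_id",
            parse: function (response) {
                // Unwrap { Holiday: {...} } into a plain attribute hash.
                return response.Holiday || response;
            }
        });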

    Read the article

  • Using multiple named outlets and a wrapper view with no content in Emberjs

    - by user1889776
    I'm trying to use multiple named outlets with Ember.js. Is my approach below correct? Markup: <script type="text/x-handlebars" data-template-name="application"> <div id="mainArea"> {{outlet main_area}} </div> </script> <script type="text/x-handlebars" data-template-name="home"> <ul id="sections"> {{outlet sections}} </ul> <ul id="categories"> {{outlet categories}} </ul> </script> <script type="text/x-handlebars" data-template-name="sections"> {{#each section in controller}} <li><img {{bindAttr src="section.image"}}></li> {{/each}} </script> <script type="text/x-handlebars" data-template-name="categories"> {{#each category in controller}} <img {{bindAttr src="category.image"}}> {{/each}} </script> JS Code: Here I set the content of the various controllers to data grabbed from a server and connect outlets with their corresponding views. Since the HomeController has no content, I set its content to an empty object - a hack to get rid of this error message: Uncaught Error: assertion failed: Cannot delegate set('categories' ) to the 'content' property of object proxy : its 'content' is undefined. App.Router = Ember.Router.extend({ enableLogging: false, root: Ember.Route.extend({ index: Ember.Route.extend({ route: '/', connectOutlets: function(router){ router.get('sectionsController').set('content',App.Section.find()); router.get('categoriesController').set('content', App.Category.find()); router.get('applicationController').connectOutlet('main_area', 'home'); router.get('homeController').connectOutlet('home', {}); router.get('homeController').connectOutlet('categories', 'categories'); router.get('homeController').connectOutlet('sections', 'sections'); } }) }) });

    Read the article

  • Error on connecting to the server: socket.io is not defined

    - by max
    I know there have been a couple of questions about the same problem; I've already checked them. I have a very simple node.js chat app: a server running on port 8000, and it works fine. My client pages are HTML, running on Apache, and I'm using socket.io to connect them to the server. It works fine on localhost, but when I upload the app to the server I keep getting this error in Firebug: io is not defined var socket = io.connect('http://atenak.com:8000/'); Sometimes it doesn't show that, but when I try to broadcast a message from the client I get this error instead: socket is undefined socket.emit('msg', { data: msg , user:'max' }); The only difference is that I've changed localhost to atenak.com! Here is my HTML code: var socket = io.connect('http://atenak.com:8000/'); var user = 'jack'; socket.on('newmsg', function (data) { if(data.user == user ) { $('#container').html(data.data); } }); function brodcast(){ var msg = $('#fild').val(); socket.emit('msg', { data: msg , user:'max' }); } </script> </head> <body> <div id="container"> </div> <input id="fild" type="text"> <input name="" type="button" onClick="brodcast();"> </body> I have included socket.io.js and the server is running OK, which means socket.io is installed on the server. Here is the live page: http://atenak.com/client.html
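
    A hedged guess at the cause (an assumption, not confirmed by the question): "io is not defined" usually means the socket.io client library was never loaded, because it is served by the node process itself rather than by Apache, so the include has to point at port 8000.

        // Load the client library from the node server (socket.io serves it at
        // /socket.io/socket.io.js), e.g.:
        //   <script src="http://atenak.com:8000/socket.io/socket.io.js"></script>
        // then connect to the same host and port:
        var socket = io.connect('http://atenak.com:8000');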

    Read the article

  • Defining an implementation independent version of the global object in JavaScript

    - by Aadit M Shah
    I'm trying to define the global object in JavaScript in a single line as follows: var global = this.global || this; The above statement is in the global scope. Hence in browsers the this pointer is an alias for the window object. Assuming that it's the first line of JavaScript to be executed in the context of the current web page, the value of global will always be the same as that of the this pointer or the window object. In CommonJS implementations, such as RingoJS and node.js, the this pointer points to the current ModuleScope. However, we can access the global object through the property global defined on the ModuleScope. Hence we can access it via the this.global property. Hence this code snippet works in all browsers and in at least RingoJS and node.js, but I have not tested other CommonJS implementations. Thus I would like to know whether this code will fail to yield correct results when run on any other CommonJS implementation, and if so, how I may fix it. Eventually, I intend to use it in a lambda expression for my implementation-independent JavaScript framework as follows (idea from jQuery): (function (global) { // javascript framework })(this.global || this);
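
    A hedged alternative sketch (an addition, not from the question): probing the usual host globals with typeof guards avoids relying on every CommonJS implementation exposing a global property on its ModuleScope, at the cost of a slightly longer expression.

        // Fall back through the common host objects; the typeof guards avoid
        // ReferenceErrors where a name is not defined at all.
        var globalObject =
            (typeof window !== 'undefined' && window) ||
            (typeof global !== 'undefined' && global) ||
            this;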

    Read the article

  • Node-webkit works on Mac, crashes and can't load module on Windows

    - by user756201
    I've created a full node-webkit app that works fine on the Mac OSX version of node-webkit. Everything works, it loads a key external nodeJS module (marked), and the world is good. However, when I try to run the app on the Windows version of Node-webkit as described in the Wiki, the app crashes immediately (in fact, it crashes immediately when I try all the options: dragging a folder onto nw.exe, dragging an app.nw compressed folder, and running both from the command line). The only thing that gets me closer is opening nw.exe and then pointing the node-webkit location bar to the index file. Then I get this error: Uncaught node.js Error Error: Cannot find module 'marked' I tried commenting out the code that requires marked: var marked = require('marked'); That returns the app to crashing immediately. I assumed it was because of context issues between node.js and the node-webkit browser, but those seem to not be at fault since I tried this suggestion to make sure it finds the correct file for the marked module...went right back to immediate crashing. I'm out of ideas because the crashes don't seem to leave me any way of knowing what the error was.
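
    A hedged diagnostic sketch (assumptions, not from the question): logging why require() fails, and falling back to an explicit path, can show whether node_modules was actually bundled into the Windows package; the layout assumed below may not match the real app.

        var path = require('path');
        var marked;
        try {
            marked = require('marked');
        } catch (e) {
            // Leave a trace instead of crashing silently, then try an explicit
            // path relative to the app's working directory (assumed layout).
            console.log('require failed:', e.message, 'cwd:', process.cwd());
            marked = require(path.join(process.cwd(), 'node_modules', 'marked'));
        }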

    Read the article

  • Using Everyauth/Express and Multiple Configurations?

    - by Zane Claes
    I'm successfully using Node.js + Express + Everyauth ( https://github.com/abelmartin/Express-And-Everyauth/blob/master/app.js ) to log in to Facebook, Twitter, etc. from my application. The problem I'm trying to wrap my head around is that Everyauth seems to be "configure and forget." I set up a single everyauth object and configure it to act as middleware for express, and then forget about it. For example, if I want to create a mobile Facebook login I do: var app = express.createServer(); everyauth.facebook .appId('AAAA') .appSecret('BBBB') .entryPath('/login/facebook') .callbackPath('/callback/facebook') .mobile(true); // mobile! app.use(everyauth.middleware()); everyauth.helpExpress(app); app.listen(8000); Here's the problem: Both mobile and non-mobile clients will connect to my server, and I don't know which is connecting until the connection is made. Even worse, I need to support multiple Facebook app IDs (and, again, I don't know which one I will want to use until the client connects and I partially parse the input). Because everyauth is a singleton which is configured once, I cannot see how to make these changes to the configuration based upon the request that is made. What it seems like is that I need to create some sort of middleware which acts before the everyauth middleware to configure the everyauth object, such that everyauth subsequently uses the correct appId/appSecret/mobile parameters. I have no clue how to go about this... Suggestions? Here's the best idea I have so far, though it seems terrible: Create an everyauth object for every possible configuration using a different entryPath for each...

    Read the article

  • How to debug a Gruntfile with breakpoints using node-inspector?

    - by Kris Hollenbeck
    So I have spent the past couple of days trying to get this to work with no luck. Most of the solutions I have found seem to work "okay" for debugging node applications, but I haven't had much luck debugging grunt standalone. I would like to be able to set breakpoints in my Gruntfile and step through the code with either the browser or an IDE. I have tried the following: Debugging using intelliJ IDE Using Grunt Console (Process finished with exit code 6) Debugging with Nodeeclipse (This sort of works okay but doesn't hit the breakpoints set in eclipse, not very intuitive) Debugging using node-inspector (This one also sort of works. I can step through a little way using F11 and F10 in chrome. But eventually it just crashes. Using F8 to skip to breakpoint never works.) ERROR MESSAGE USING NODE-INSPECTOR So currently node-inspector feels like it has gotten me the closest to what I want. To get here I did the following: From my grunt directory I ran the following commands: grunt node-inspector node --debug-brk Gruntfile.js And then from there I went to localhost:8080/debug?port=5858 to debug my Gruntfile.js. But like I mentioned above, as soon as I hit F8 to skip to the breakpoint it crashes with the above error. Has anybody had any success using this method to debug a Gruntfile? So far my search efforts have not found a very well documented way of doing this, so hopefully this will be useful or beneficial information for future users. Also, I am using Windows 7, by the way. Thanks in advance.

    Read the article

  • [jQuery] What would be the best way to perform a basic CRUD using AJAX

    - by rasouza
    I'm having trouble making a simple CRUD in my site. I have a table of registries <table> <tbody> <?php foreach ($row as $reg) { ?> <tr <?php if ($reg['value'] < 0) { echo "class='error'"; } ?>> <td><?php echo $reg['creditor'] ?></td> <td><?php echo $reg['debtor'] ?></td> <td><?php echo $reg['reason'] ?></td> <td>R$ <?php echo number_format(abs($reg['value']), 2, ',', ' ')?></td> <td><a href="<?php echo $this->baseUrl(); ?>/history/delete/id/<?php echo $reg['id']; ?>" class="delete"><img src="http://192.168.0.102/libraries/css/blueprint/plugins/buttons/icons/cross.png" alt=""/></a></td> </tr> <?php } ?> </tbody> </table> in which I would like to perform a simple delete on these rows using AJAX (preferably with jQuery). The question is: do I have to create a function in JS and add an onclick event in the HTML? Is there a more consistent way of doing this, like adding $('.delete').click() directly in the js file? If so, how do I pass the row ID to the ajax function? What I really want to know is how to pass the row ID to the jQuery $.ajax() function in a clean way.
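
    A hedged sketch of one approach (the request type and the row-removal behaviour are assumptions): bind a delegated click handler in the JS file and read the row id back out of the link's own href, so no inline onclick is needed.

        // Delegated handler: also works for rows added later.
        $(document).on('click', 'a.delete', function (e) {
            e.preventDefault();
            var url = $(this).attr('href');       // e.g. /history/delete/id/42
            var id = url.split('/').pop();        // the row id, if the endpoint wants it separately
            var row = $(this).closest('tr');
            $.ajax({
                url: url,                         // same URL the plain link points at
                type: 'GET',
                data: { id: id },
                success: function () {
                    row.remove();                 // drop the row once the server confirms
                }
            });
        });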

    Read the article

  • Problems building nodejs on MacOS Snow Leopard

    - by mrwooster
    I am having trouble building nodejs on MacOS Snow Leopard. I think it might have something to do with my PATH variable not being set correctly for the developer tools location. For some reason, the Developer tools (gcc, g++, make, etc.) are all stored in /Developer/usr/bin. I added it to my PATH variable as follows: $ export PATH=$PATH:/Developer/usr/bin $ echo $PATH /opt/local/bin:/opt/local/sbin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/local/git/bin:/usr/X11/bin:/Developer/usr/bin When I try to configure, it complains about not finding OpenSSL - OK, not a big problem. So I try with --without-ssl: $ ./configure --without-ssl Checking for program g++ or c++ : /Developer/usr/bin/g++ Checking for program cpp : /Developer/usr/bin/cpp Checking for program ar : /usr/bin/ar Checking for program ranlib : /Developer/usr/bin/ranlib Checking for g++ : ok Checking for program gcc or cc : /Developer/usr/bin/gcc Checking for gcc : ok Checking for library dl : yes Checking for library util : yes Checking for library rt : not found --- libeio --- Checking for library pthread : yes Checking for function pthread_create : not found /Users/Guy/git_src/node/node/deps/libeio/wscript:13: error: the configuration failed (see '/Users/Guy/git_src/node/node/build/config.log') Anyone know how I can get round this? I am suspicious that it might be something to do with the PATH or another ENV variable, but not sure. Thanks G

    Read the article

  • route port 3000 to apache2 alias

    - by user223470
    I have a meteor application running on port 3000. I can successfully connect to the program with www.myurl.com:3000, but would rather connect to it via www.myurl.com/myapp. I started with the instructions on this web site: http://www.andrehonsberg.com/article/deploy-meteorjs-vhosts-ubuntu1204-mongodb-apache-proxy and I have the following Apache configuration file: <VirtualHost *:80> ServerName myurl.com ProxyRequests off <Proxy *> Order deny,allow Allow from all </Proxy> <Location /> ProxyPass http://localhost:3000/ ProxyPassReverse http://localhost:3000/ </Location> </VirtualHost> I do not know how to continue from here to get the program on www.myurl.com/myapp. In other situations, I would use an Alias within the Apache configuration file, but that doesn't seem like the right direction to go in this case. How do I configure Apache to send port 3000 to www.myurl.com/myapp?

    Read the article

  • Database OR Array

    - by rezoner
    What is the exact point of using an external database system if I have simple relations (95% of queries depend on the ID)? I am storing users and their stats. Why would I use an external database if I can have neat constructions like: db.users[32] = something An array of 500K users is not that big a burden on RAM. Pros are: no problematic asynchronicity (instant results), easy export/import, dealing with the database LITERALLY like with a native object. P.S. Some considerations: Would it be faster or slower to do collection[3] than db.query("select ... I am going to store it as a file (or files). There is only ONE application/process accessing this data, and the code is executed line by line - please don't elaborate about locking. Please don't answer with database suggestions, but rather why to use an external DB over a native array/object - I have experience in a few databases - that's not the case. What I am building is a client/gateway/server(s) game. The gateway deals with all user data, processing, authentication, writing statistics, etc. No other part of the software needs direct access to this data/database.
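
    A hedged sketch of the in-memory approach being described (the file name and flush interval are assumptions, not from the question): keep users in a plain object keyed by id and periodically flush it to disk.

        var fs = require('fs');
        var db = { users: {} };

        db.users[32] = { name: 'alice', stats: { wins: 10 } }; // instant, synchronous access

        // Flush the whole structure to a file every 60 seconds.
        setInterval(function () {
            fs.writeFile('users.json', JSON.stringify(db.users), function (err) {
                if (err) console.error('flush failed:', err);
            });
        }, 60 * 1000);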

    Read the article

  • Where does Rundeck execute local tasks from?

    - by Leon Stafford
    I'm trying to interact with the nodejs Azure SDK from a CentOS installation of Rundeck. If I try from the "run" ad hoc virtual shell, it works after running azure account import <mykey>, and I can then also execute other Azure commands inside of jobs if I set them as Rundeck node tasks and do not select "dispatch to nodes" in the job settings. Trying to run the Azure SDK commands as commands to be dispatched to the node (local) fails with the error: localhost1-NodeDispatch-localexec 04:53:04 /usr/bin/env: node: No such file or directory 04:53:04 Failed: NonZeroResultCode: Result code was 127 I am not able to "jumpstart" the same environment by running azure account import <mykey>. I am assuming this is a permissions/environmental issue, though I'm not sure how to fix it. UPDATE: Executing whoami from the same job returns rundeck, so I assume I will need to either modify that to execute tasks as my system user or grant permissions to get the rundeck user into the node environment the Azure SDK is running in?

    Read the article

  • MongoDB: why is my mongo server using two PIDs?

    - by Lucas
    I started my mongo with the following command: [lucas@ecoinstance]~/node/nodetest2$ sudo mongod --dbpath /home/lucas/node/nodetest2/data 2014-06-07T08:46:30.507+0000 [initandlisten] MongoDB starting : pid=6409 port=27017 dbpat h=/home/lucas/node/nodetest2/data 64-bit host=ecoinstance 2014-06-07T08:46:30.508+0000 [initandlisten] db version v2.6.1 2014-06-07T08:46:30.508+0000 [initandlisten] git version: 4b95b086d2374bdcfcdf2249272fb55 2c9c726e8 2014-06-07T08:46:30.508+0000 [initandlisten] build info: Linux build14.nj1.10gen.cc 2.6.3 2-431.3.1.el6.x86_64 #1 SMP Fri Jan 3 21:39:27 UTC 2014 x86_64 BOOST_LIB_VERSION=1_49 2014-06-07T08:46:30.509+0000 [initandlisten] allocator: tcmalloc 2014-06-07T08:46:30.509+0000 [initandlisten] options: { storage: { dbPath: "/home/lucas/n ode/nodetest2/data" } } 2014-06-07T08:46:30.520+0000 [initandlisten] journal dir=/home/lucas/node/nodetest2/data/ journal 2014-06-07T08:46:30.520+0000 [initandlisten] recover : no journal files present, no recov ery needed 2014-06-07T08:46:30.527+0000 [initandlisten] waiting for connections on port 27017 It appears to be working, as I can execute mongo and access the server. However, here are the process running mongo: [lucas@ecoinstance]~/node/testSite$ ps aux | grep mongo root 6540 0.0 0.2 33424 1664 pts/3 S+ 08:52 0:00 sudo mongod --dbpath /ho me/lucas/node/nodetest2/data root 6541 0.6 8.6 522140 52512 pts/3 Sl+ 08:52 0:00 mongod --dbpath /home/lu cas/node/nodetest2/data lucas 6554 0.0 0.1 7836 876 pts/4 S+ 08:52 0:00 grep mongo As you can see, there are two PID's for mongo. Before I ran sudo mongod --dbpath /home/lucas/node/nodetest2/data, there were none (besides the grep of course). How did my command spawn two PID's, and should I be concerned? Any suggestions or tips would be great. Additional Info In addition, I may have other issues that might suggest a cause. I tried running mongo with --fork --logpath /home/lucas..., but it did not work. More information below: [lucas@ecoinstance]~/node/nodetest2$ sudo mongod --dbpath /home/lucas/node/nodetest2/data --fork --logpath /home/lucas/node/nodetest2/data/ about to fork child process, waiting until server is ready for connections. forked process: 6578 ERROR: child process failed, exited with error number 1 [lucas@ecoinstance]~/node/nodetest2$ ls -l data/ total 163852 drwxr-xr-x 2 mongodb nogroup 4096 Jun 7 08:54 journal -rw------- 1 mongodb nogroup 67108864 Jun 7 08:52 local.0 -rw------- 1 mongodb nogroup 16777216 Jun 7 08:52 local.ns -rwxr-xr-x 1 mongodb nogroup 0 Jun 7 08:54 mongod.lock -rw------- 1 mongodb nogroup 67108864 Jun 7 02:08 nodetest1.0 -rw------- 1 mongodb nogroup 16777216 Jun 7 02:08 nodetest1.ns Also, my db path folder is not the original location. It was originally created under the default /var/lib/mongodb/ and moved to my local data folder. This was done after shutting down the server via /etc/init.d/mongod stop. I have a Debian Wheezy server, if it matters.

    Read the article

  • How to install npm when npmjs.org can't be resolved

    - by Rahul Mehta
    When I'm doing curl it says it could not resolve the host. What can I do? curl http://npmjs.org/install.sh | sudo sh curl: (6) Couldn't resolve host 'npmjs.org' http://npmjs.org/ /etc/resolv.conf search x1 nameserver x2 nameserver 8.8.8.8 nameserver 8.8.4.4 nslookup result: nslookup google.com Server: x1 Address: x1#53 Non-authoritative answer: *** Can't find google.com: No answer Non-authoritative answer: * Can't find google.com: No answer

    Read the article

  • Nginx, HAProxy, Unicorn, Rails and Node settings

    - by Julien Genestoux
    Our application is currently only a "regular" web app, with no fancy things like streaming HTTP or websockets. It's mostly a Rails app, served by a few (20 on 2 machines) Unicorn workers, proxied by a venerable nginx server which deals with load balancing. This has been working quite well for the past year, and the app now serves between 400 and 800 requests per second at any point during the day. We're soon releasing 2 new APIs, which are both served by a Node application: a websocket one, as well as a long-polling HTTP one (the fancy kind, like the Twitter streaming API, where HTTP connections never end). They both use the same port on node, and since the node app is stateless, we can certainly deploy a few of them to handle the traffic. The node app is now deployed as 5 instances, listening on 5 different 'private' ports on the same host. We need to put something in front of them to load balance, but also something that is able to deal with sockets (either websocket or HTTP streaming) which are intended to stay 'up' for days. The question, then, is: what? I read somewhere that HAProxy does a better job than Nginx at this. What do you recommend?

    Read the article

  • Content-Length and Transfer-Encoding: chunked with nginx and node-http-proxy

    - by rampr
    I have the following setup: node-http-proxy acts as a reverse proxy, forwarding all requests to nginx/socket.io as necessary. My problem is this: when I send an HTTP DELETE request from the browser, node-http-proxy adds a "Transfer-Encoding: chunked" header because the request from the browser had no Content-Length. The request from the browser had no Content-Length because it had no body. Nginx doesn't like the Transfer-Encoding: chunked header and throws a 411 asking for Content-Length. The problem gets solved when I send dummy data as part of the DELETE request, so there is a Content-Length, node-http-proxy doesn't add the Transfer-Encoding: chunked header, and nginx is happy. I want to understand whether node-http-proxy is working as expected when it adds a Transfer-Encoding: chunked header to a request that has no Content-Length because it has no body.
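
    A hedged sketch of a workaround (it assumes the http-proxy createProxyServer API and a placeholder target): forcing an explicit Content-Length of 0 onto body-less DELETE requests before proxying removes the reason to fall back to chunked transfer encoding, so nginx no longer answers 411.

        var http = require('http');
        var httpProxy = require('http-proxy');

        var proxy = httpProxy.createProxyServer({ target: 'http://localhost:8080' });

        http.createServer(function (req, res) {
            // A body-less request can legitimately carry Content-Length: 0.
            if (req.method === 'DELETE' && !req.headers['content-length']) {
                req.headers['content-length'] = '0';
            }
            proxy.web(req, res);
        }).listen(8000);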

    Read the article

  • gcc sandboxing tool - AppArmor / CHROOT jail on Ubuntu 12.04

    - by StuR
    We have a Node application as the front end to a C++ sandboxing tool, which compiles code using gcc and outputs the result to the browser. e.g. exec("gcc -o /tmp/test /tmp/test.cpp", function (error, stdout, stderr) { if(!stderr) { execFile('/tmp/test', function(error, stdout, stderr) {}); } }); This works fine. However, as you can imagine this is a security nightmare if it were to be made public - so I was thinking of two options to protect my stack: 1) A CHROOT jail - but this in itself wouldn't be enough to prevent directory traversal / file access. 2) AppArmor ? So my question is really, how could I protect my stack from any nasties that could come from: A) Compiling unknown code using gcc B) Executing the compiled code
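
    Not a substitute for a chroot/AppArmor sandbox, but a hedged sketch of limits worth adding around the existing exec calls regardless (the timeouts and buffer sizes are assumptions): cap how long the compile and the compiled program may run and how much output they may produce.

        var exec = require('child_process').exec;

        var limits = { timeout: 5000, maxBuffer: 200 * 1024 }; // kill after 5s, cap output at 200 KB

        exec('gcc -o /tmp/test /tmp/test.cpp', limits, function (error, stdout, stderr) {
            if (!error && !stderr) {
                exec('/tmp/test', limits, function (err, out) {
                    // send `out` back to the browser here
                });
            }
        });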

    Read the article

  • Elastic beanstalk access private git repo

    - by user221676
    I am currently trying to add an SSH key to my Elastic Beanstalk instances using .ebextensions commands. The keys are stored in my application code, and I try to copy them to the root .ssh folder so I can access them when doing a git+ssh clone later. Here is an example of the config file in my .ebextensions folder: packages: yum: git: [] container_commands: 01-move-ssh-keys: command: "cp .ssh/* ~root/.ssh/; chmod 400 ~root/.ssh/tca_read_rsa; chmod 400 ~root/.ssh/tca_read_rsa.pub; chmod 644 ~root/.ssh/known_hosts;" 02-add-ssh-keys: command: "ssh-add ~root/.ssh/tca_read_rsa" The problem is that I get an error when attempting to clone the repo: Host key verification failed. I have tried many ways of adding the host to the known_hosts file, but none have worked! The command that is doing the clone is npm install, as the repo points to a node module.

    Read the article

  • Having Trouble Getting My Apache Server Online (NodeJS and Apache)

    - by Jeff Armingol
    I am new here. This is my situation. I am using the nodejs modules serialport2 and socket.io, because I am trying to forward the data from my arduino hardware through serial ports. In my server-side script, I read the data and then forward it to the client side. Now I am using Apache to serve the html page, which is the client side. I am running nodejs on port 8000 and Apache on port 80. It runs OKAY when I view it in my browser at localhost:80. The data is appearing and seems fine. Now when I tried to get my Apache server online using a free DDNS provider (http://www.noip.com/) and my port 80, it loaded the webpage but no data appears on the page. What seems to be the problem here? I really need your expertise and advice. Thanks in advance!
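
    A hedged guess at one likely culprit (an assumption, not confirmed by the question): if the client script connects to localhost, it works on the serving machine but not from outside. Deriving the socket URL from the page's own hostname avoids hard-coding it; this assumes port 8000 is reachable from the internet.

        // Connect to whatever host served the page, on the node port.
        var socket = io.connect('http://' + window.location.hostname + ':8000');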

    Read the article

  • nodejs server hanging from time to time

    - by Johann Philipp Strathausen
    I have a node server (0.6.6) running an Express application, along with Mongoose and s3, on an Ubuntu 11.04 machine. Several times per hour, the server hangs. That is, the application is working fine, I see the Express log output, and then all of a sudden the server stops responding. No errors, no traces, no log output, and strangely enough the browser won't show the request even in the network debugging window. From any machine in the local network it's the same behaviour. I restart the server and it's okay again for several minutes, then it starts to hang again, every time while doing something different. The same application on Amazon on the same Ubuntu version works fine and never hangs. I know all this is kind of vague, but I don't know where to start. Have any of you seen something like this before? Any ideas?
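
    A hedged first diagnostic step (an assumption about where to look, not a fix from the question): make sure a silent failure at least leaves a trace, and check whether the event loop is still turning while the server appears hung.

        // Log anything that would otherwise kill or wedge the process silently.
        process.on('uncaughtException', function (err) {
            console.error('uncaught exception:', err.stack || err);
        });

        // Heartbeat: if these lines stop appearing, the event loop itself is blocked.
        setInterval(function () {
            console.log('event loop alive', new Date().toISOString());
        }, 5000);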

    Read the article

  • Is there a way to automate changing filenames in <link> and <script> tags?

    - by nepsdotin
    When we use the Expires header for text files like js and css, the contents are cached in the browser; to get new content we need to put the new file names into the <link> and <script> tags in the html files whenever we make changes. How can we automate this? I may have a bunch of html files in multiple folders, also in subdirectories. There would be a text file filelist.txt with OldName and NewName columns: oldfile1-ver-1.0.js oldfile1-ver-2.0.js, oldfile2-ver-1.0.js oldfile2-ver-2.0.js, oldfile3-ver-1.0.js oldfile3-ver-2.0.js, oldfile4-ver-1.0.js oldfile4-ver-2.0.js. The script should change all occurrences of oldfile1-ver-1.0.js into oldfile1-ver-2.0.js in the html and php files. I would run this script before I start uploading. Finally, the script could create a list of the files and line numbers where it made updates. The solution can be in PERL/PHP/BATCH or anything that's nice and elegant.
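
    They ask for PERL/PHP/BATCH "or anything that's nice and elegant"; here is a hedged Node sketch of the same pass (a single flat directory and whitespace-separated pairs are assumed; recursing into subdirectories and reporting line numbers are left out for brevity).

        var fs = require('fs');

        // filelist.txt: first line is the "OldName NewName" header, then one pair per line.
        var pairs = fs.readFileSync('filelist.txt', 'utf8')
            .split('\n').slice(1)
            .map(function (line) { return line.trim().split(/\s+/); })
            .filter(function (p) { return p.length === 2; });

        fs.readdirSync('.').forEach(function (file) {
            if (!/\.(html|php)$/.test(file)) return;
            var text = fs.readFileSync(file, 'utf8');
            pairs.forEach(function (p) {
                text = text.split(p[0]).join(p[1]); // literal replace-all, no regex escaping needed
            });
            fs.writeFileSync(file, text);
        });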

    Read the article

  • Chef bash resource not executing as specified user

    - by Arthur Maltson
    I'm writing a Chef cookbook to install Hubot. In the recipe, I do the following: bash "install hubot" do user hubot_user group hubot_group cwd install_dir code <<-EOH wget https://github.com/downloads/github/hubot/hubot-#{node['hubot']['version']}.tar.gz && \ tar xzvf hubot-#{node['hubot']['version']}.tar.gz && \ cd hubot && \ npm install EOH end However, when I try to run chef-client on the server installing the cookbook, I'm getting a permission denied writing to the directory of the user that runs chef-client, not the hubot user. For some reason, npm is trying to run under the wrong user, not the user specified in the bash resource. I am able to run sudo su - hubot -c "npm install /usr/local/hubot/hubot" manually, and this gets the result I want (installs hubot as the hubot user). However, it seems chef-client isn't executing the command as the hubot user. Below you'll find the chef-client execution. Thank you in advance. Saving to: `hubot-2.1.0.tar.gz' 0K ...... 100% 563K=0.01s 2012-01-23 12:32:55 (563 KB/s) - `hubot-2.1.0.tar.gz' saved [7115/7115] npm ERR! Could not create /home/<user-chef-client-uses>/.npm/log/1.2.0/package.tgz npm ERR! Failed creating the tarball. npm ERR! couldn't pack /tmp/npm-1327339976597/1327339976597-0.13104878342710435/contents/package to /home/<user-chef-client-uses>/.npm/log/1.2.0/package.tgz npm ERR! error installing [email protected] Error: EACCES, permission denied '/home/<user-chef-client-uses>/.npm/log' ... npm not ok ---- End output of "bash" "/tmp/chef-script20120123-25024-u9nps2-0" ---- Ran "bash" "/tmp/chef-script20120123-25024-u9nps2-0" returned 1

    Read the article

  • Alternative to cPanel (for email, etc.)

    - by Dboy1612
    I'm currently setting up a VPS for the first time. The standard I've always worked with before on shared hosting was cPanel, but since the majority of the work I plan on doing from now on will use NodeJS and Python/Flask, I'd like to avoid needing to install Apache/MySQL/PHP. What would be my best bet for managing a mail server other than cPanel? Or even other specific server settings that may come in handy later? I plan on using Ubuntu, if that counts for anything.

    Read the article

  • JS: using 'var me = this' to reference an object instead of using a global array

    - by Marco Demaio
    The example below is just an example; I know that I don't need an object to show an alert box when the user clicks on div blocks, but it's a simple example to explain a situation that frequently happens when writing JS code. In the example below I use a globally visible array of objects to keep a reference to each newly created HelloObject; this way, events fired when clicking on a div block can use the reference in the array to call the HelloObject's public function hello(). First have a look at the code: <!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd"> <html><head><meta http-equiv="Content-Type" content="text/html; charset=windows-1252"> <title>Test </title> <script type="text/javascript"> /***************************************************** Just a cross browser append event function, don't need to understand this one to answer my question *****************************************************/ function AppendEvent(html_element, event_name, event_function) {if(html_element) {if(html_element.attachEvent) html_element.attachEvent("on" + event_name, event_function); else if(html_element.addEventListener) html_element.addEventListener(event_name, event_function, false); }} /****************************************************** Just a test object ******************************************************/ var helloobjs = []; var HelloObject = function HelloObject(div_container) { //Adding this object in helloobjs array var id = helloobjs.length; helloobjs[id] = this; //Appending click event to show the hello window AppendEvent(div_container, 'click', function() { helloobjs[id].hello(); //THIS WORKS! }); /***************************************************/ this.hello = function() { alert('hello'); } } </script> </head><body> <div id="one">click me</div> <div id="two">click me</div> <script type="text/javascript"> var t = new HelloObject(document.getElementById('one')); var t = new HelloObject(document.getElementById('two')); </script> </body></html> In order to achieve the same result I could simply replace the code //Appending click event to show the hello window AppendEvent(div_container, 'click', function() { helloobjs[id].hello(); //THIS WORKS! }); with this code: //Appending click event to show the hello window var me = this; AppendEvent(div_container, 'click', function() { me.hello(); //THIS WORKS TOO AND THE GLOBAL helloobjs ARRAY BECOMES SUPEFLOUS! }); this would make the helloobjs array superfluous. My question is: does this 2nd option, in your opinion, create memory leaks in IE or strange circular references that might lead to browsers slowing down or breaking? I don't know how to explain it, but coming from a background as a C/C++ coder, doing it this 2nd way sounds like some sort of circular reference that might leak memory at some point. I also read on the internet about the IE closures memory leak issue http://jibbering.com/faq/faq_notes/closures.html (I don't know if it was fixed in IE7, and if so, I hope it does not come back in IE8). Thanks
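
    A hedged sketch of the classic leak-avoidance pattern for old IE (an addition, not from the question): the worry is the cycle DOM element -> handler -> closure -> constructor scope -> DOM element, and it can be broken by dropping the element reference from the constructor's scope once the handler is wired up.

        var HelloObject = function (div_container) {
            var me = this;
            this.hello = function () { alert('hello'); };
            AppendEvent(div_container, 'click', function () {
                me.hello(); // closes over `me`, not over the DOM element
            });
            div_container = null; // break the scope -> element link for old IE
        };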

    Read the article
