Search Results

Search found 626 results on 26 pages for 'frontend'.

Page 14/26

  • Mobile Web Applications – A guide for professional development

    - by JuergenKress
    (Tobias Bosch, Stefan Scheidt, Torsten Winterberg / Opitz Consulting Deutschland GmbH). There is real hype around mobile solutions: smartphones and tablets are everywhere, and frontend architecture is changing quickly to adopt cross-browser technologies like HTML5 and extensive JavaScript-based development. In this book we introduce our software development process for building test-driven single-page JavaScript web applications, which we see as the future alongside native apps. We start with a short introduction of our RYLC showcase (known from our SOA articles), give a very short introduction to JavaScript, then cover jQuery Mobile, AngularJS, testing and backend communication, and we close by deploying our RYLC web app as a hybrid app using the PhoneGap (Cordova) framework. Don't expect too much theory – it's a practical guide explaining how the RYLC web app was built, meant to kickstart your own development. Currently only available in German, as paperback and eBook. WebLogic Partner Community: for regular information, become a member of the WebLogic Partner Community at http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

    Read the article

  • Using nginx and/or varnish to cache server-generated 301 redirects

    - by rlotun
    I'm implementing a sort of URL-shortener service. I have a backend app server that takes in a request, does some computation and returns a 301 redirect URL back to an nginx frontend:

        request ---> nginx ----> app_server

    What I want to be able to do is cache this returned 301 for repeats of the same request (a specific URL with a "short code"). Does nginx do this caching automatically, or should I drop in something like Varnish between nginx and the app_server? I can easily cache this in memcache, but that would require hitting the app_server, which I'm sure can be dispensed with after the first request. Thanks.
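
    nginx can cache proxied 301s once a cache zone is configured. A minimal sketch, assuming nginx with the proxy module (paths, zone name, upstream port and lifetime are illustrative):

        proxy_cache_path /var/cache/nginx/shortener levels=1:2 keys_zone=shortener:10m max_size=100m;

        server {
            listen 80;
            location / {
                proxy_pass http://127.0.0.1:8081;          # the app_server (placeholder address)
                proxy_cache shortener;
                proxy_cache_key $scheme$host$request_uri;  # one cache entry per short-code URL
                proxy_cache_valid 301 1h;                  # cache 301s for an hour; upstream Cache-Control headers also apply
            }
        }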

    Read the article

  • How do I create DNS entries for EC2 instances created by Auto Scaling?

    - by Evan
    I'm looking into using Auto Scaling groups for a tier of webservers that would be fronted by an ELB. One of the things I'm having a hard time with is how to give each new instance a proper DNS name. For example, I'd like the webservers to have names like frontend-web-XXX.prod.example.com so they appear correctly in logs and are easier to keep organized. I have two other tiers I'd ultimately like to autoscale as well, with names like api-web-XXX.prod.example.com. I have some experience with CloudFormation templates and have spun up individual instances with associated Route53 records, but I don't see any indication of how this can be done within an Auto Scaling group.
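
    One common pattern is to have each instance register itself at boot (e.g. from user data), since the Auto Scaling group won't do it on its own. A rough sketch using the AWS CLI and the instance metadata service; the zone ID, domain and naming scheme are placeholders, and the instance needs an IAM role that allows Route53 changes:

        #!/bin/bash
        INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
        LOCAL_IP=$(curl -s http://169.254.169.254/latest/meta-data/local-ipv4)
        NAME="frontend-web-${INSTANCE_ID#i-}.prod.example.com"

        # UPSERT an A record for this instance in the hosted zone
        aws route53 change-resource-record-sets \
          --hosted-zone-id Z0000000EXAMPLE \
          --change-batch "{\"Changes\":[{\"Action\":\"UPSERT\",\"ResourceRecordSet\":{
            \"Name\":\"${NAME}\",\"Type\":\"A\",\"TTL\":60,
            \"ResourceRecords\":[{\"Value\":\"${LOCAL_IP}\"}]}}]}"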

    Read the article

  • Configuring Linux Network

    - by Reiler
    Hi, I'm working on some software that runs on a CentOS 5.x installation. Our customers are not allowed to log in to Linux; everything is done from Windows applications developed by us. So we have built a frontend for the user to configure the network setup: static/DHCP, IP address, gateway, DNS, hostname. Right now I let the user enter the information in the Windows app and then write it on the Linux server like this: the nameserver goes into /etc/resolv.conf; the gateway and hostname go into /etc/sysconfig/network; and the IP address, netmask and BOOTPROTO (dhcp or static) go into /etc/sysconfig/network-scripts/ifcfg-eth0. I also found out (after some time) that I was unable to send mail unless I added "127.0.0.1 hostname" to /etc/hosts. All this seems to work, but is there a better or easier way to do it? I also read the network configuration back in nearly the same way, but if I use DHCP I miss some information, for instance the IP address. I know that I can get some of it from the command line (ifconfig), but I don't get, for instance, the hostname, gateway and DNS. Is there a command-line tool that will display this?
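
    For reading the live values back regardless of DHCP or static configuration, a rough sketch using stock command-line tools on CentOS 5 (the interface name eth0 is assumed):

        hostname                                    # current hostname
        ip addr show eth0                           # IP address and netmask, also under DHCP
        ip route | awk '/^default/ {print $3}'      # default gateway
        grep '^nameserver' /etc/resolv.conf         # DNS servers in use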

    Read the article

  • Symfony sfControlPanel 500 error only when executing 'tasks'

    - by rrlange
    As a bit of background, I recently had to restore a Symfony site from backup. Ever since the 'successful' restore, I am running into an exception when trying to execute a(ny) task via the web-based sfControlPanel:

        Unable to find PHP executable
        stack trace:
          at sfToolkit::getPhpCli() in SF_ROOT_DIR/plugins/sfControlPanelPlugin/modules/sfControlPanel/actions/actions.class.php line 93

    FYI: symfony 1.0.6, PHP 5.2.6 (cli) (built: May 2 2008 16:06:40), Apache/2.2.3, CentOS 5.*. Thank you very much for any and all suggestions as to what might be amiss. Addendum: I neglected to mention that I can reach the frontend app (and certain backend apps) perfectly fine via the web. I am also able to run common tasks via the command line (cc etc.).

    Read the article

  • Nginx + SSI doesn't work [migrated]

    - by boopidoopi
    I have a problem: nginx doesn't work with SSI. Nginx listens on port 80 (frontend), apache2 listens on port 81 (backend). This is my nginx configuration:

        server {
            listen 80;
            server_name test.dev www.test.dev;
            error_log /var/log/nginx/error.log debug;
            log_subrequest on;
            location / {
                ssi on;
                proxy_pass http://localhost:81;
                proxy_redirect off;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                client_max_body_size 15m;
                client_body_buffer_size 128k;
            }
        }

    The SSI include in test.dev's index.php is: <!--# include virtual="http:test.dev/test.html" -- When I open test.dev/index.php I see an empty page, and the page source still contains the unprocessed directive: <!--# include virtual="http:test.dev/test.html" -- So how do I enable SSI in nginx? Can you help me?
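
    For reference, the directive form nginx's ssi module expects is a complete SSI comment (closed with -->) whose virtual= parameter is a server-local URI handled by one of the server's locations, rather than an absolute http: URL. A hedged sketch (the path is illustrative):

        <!--# include virtual="/test.html" -->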

    Read the article

  • Can certain system-hungry modules be disabled in Ubuntu?

    - by Ole Thomsen Buus
    Hi, let me add some context: I am currently using Ubuntu 9.10 64-bit (Desktop) on a relatively powerful stationary PC (Intel Core i7 920, 12 GB RAM). My purpose is high-speed imaging with a Point Grey Grasshopper machine-vision camera (for research, a PhD project). This camera is capable of 200 fps at full VGA (640x480) resolution. The camera is connected over FireWire (1394b), and the drivers and software from Point Grey work great. I have developed a console C++ application that can grab a certain number of frames into preallocated memory and afterwards save the grabbed frames to the hard drive. Currently it works fine, but sometimes I observe a few frame drops (1-3). When this happens I reset the experiment and repeat the recording, and usually I am lucky the second time, with no frame drops (the camera driver has an internal frame counter that I am using). Question: I usually go to tty1 and use "sudo service gdm stop" to disable the graphical frontend. It seems to release some memory, though that is not my main concern; my concern is CPU resources. Are there other system-hungry modules that can be disabled temporarily so that the CPU is less busy on Ubuntu 9.10? At some point in the future I will update to 10.10. Should I perhaps opt for the server edition instead? Thanks.
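
    A rough sketch of commands for finding and pausing other CPU consumers on an Ubuntu 9.x desktop (the service names are only examples; stop only what you know you don't need during a capture):

        sudo service gdm stop                   # stop the graphical frontend, as already described
        service --status-all                    # list init.d services and whether they are running
        ps aux --sort=-%cpu | head -n 15        # see which processes actually consume CPU
        sudo service cron stop                  # example: pause a non-essential service during capture
        sudo service cron start                 # and resume it afterwards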

    Read the article

  • Is SQL Azure useful without Windows Azure?

    - by KallDrexx
    I am currently doing some research to get preliminary IT cost projections for a project, and I was looking at Azure. Since this is a startup, I do not want to deal with IT operations myself and instead am looking at having it all professionally hosted. I am looking at Azure because of the SLA assurances, the disaster recovery operations already in place, and the reliability. I'm playing with some numbers, and I am wondering whether hosting my database on SQL Azure is an option while hosting the actual web pages on another host, until I need the frontend scalability of Azure. Is this actually feasible, or will the latency of requests between the web host and Azure be too high, so that I would be better off hosting both on the same service?

    Read the article

  • nginx static file buffer

    - by Philip
    I have an NFS share to which several frontend servers are connected, to make the files stored on it available for HTTP downloads. It looks like I have problems with the way Apache is serving the files: there seems to be a very small buffer, or no buffer at all, which results in a lot of disk seeks. I did some testing with loading the whole requested file into memory at once and serving it to the client from memory; with this technique I need fewer disk seeks per download stream. Since I don't want to implement this myself for production use, I thought I could maybe use nginx for this, because the documentation says it uses buffers for static file serving. Is it possible to increase the buffer size to a few MB, and if so, which config parameter do I have to change? Does anyone have experience with large buffers for static file serving? Is there a better way to reduce disk seeks?
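
    A hedged sketch of the nginx directives that control buffering for static files (paths and sizes are illustrative):

        location /downloads/ {
            root /mnt/nfs;
            sendfile off;             # serve through userland buffers instead of sendfile()
            output_buffers 2 4m;      # two 4 MB buffers per connection, so disk reads come in large chunks
            # alternatively keep sendfile on and let the kernel/NFS readahead do the batching:
            # sendfile on;
            # tcp_nopush on;
        }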

    Read the article

  • How to make my changes in httpd.conf stick on WHM/cPanel/EasyApache

    - by Seiti
    I'm setting up a server and trying to configure Apache. It only needs to work as a frontend to Tomcat. To do that I added some directives to the VirtualHost, using mod_proxy:

        <VirtualHost *>
            ServerName myserver.domain.com
            ProxyRequests Off
            ProxyPass / http://myserver.domain.com:8080/
            ProxyPassReverse / http://myserver.domain.com:8080/
        </VirtualHost>

    It works fine, and if the need arises I'll use mod_jk. But how do I do this the right way with EasyApache, and stop it from rewriting my changes every time?
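
    A hedged sketch of the per-domain include mechanism WHM/cPanel provides for exactly this, so EasyApache rebuilds don't clobber the directives. The paths and script names are quoted from memory of cPanel's custom-include scheme and should be checked against the documentation for your WHM version:

        # /usr/local/apache/conf/userdata/std/2/<cpanel_user>/myserver.domain.com/proxy.conf
        ProxyRequests Off
        ProxyPass / http://myserver.domain.com:8080/
        ProxyPassReverse / http://myserver.domain.com:8080/

        # then regenerate httpd.conf and restart Apache:
        /scripts/rebuildhttpdconf
        /scripts/ensure_vhost_includes --all-users
        service httpd restart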

    Read the article

  • What are good and bad jitter times for a LAN

    - by garyb32234234
    I've just run jperf (a frontend to iperf) on our network between two workstations, and it recorded jitter between 0.033 ms and 0.048 ms. Is this good or bad? Are there more variables I would need to consider to make that call? EDIT: TCP/IP Ethernet LAN, 43 PCs and 1 server, a 100 Mbit main switch plus various small 8-port switches; the test was done using UDP, and it's a Windows domain. I want to install a few VoIP softphones on the workstations and see how many I can run that work reliably, so I'm testing a few different workstations around the network to find where the best-quality network paths are. I will also change some equipment if I identify bad connections.
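
    For reference, a sketch of the kind of UDP test that reports jitter with iperf 2 (the address, bandwidth and duration are illustrative; run the server side on one workstation and the client on another):

        iperf -s -u                                   # on the receiving workstation
        iperf -c 192.168.1.20 -u -b 1M -t 30 -i 1     # on the sender: 1 Mbit/s UDP for 30 s, reporting every second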

    Read the article

  • API server not functioning ["The connection has been reset"]

    - by Miguel Beltrán
    I'm having some trouble with one of my servers. I've built an application with two servers: a frontend that grabs data from the API server (Ubuntu Server). Well, yesterday we had a lot of visits and the API server stopped functioning, but: I can still work in MySQL over SSH; the memory usage is OK; the logs are OK; the bandwidth usage is OK; and if I restart the server or Apache2 it works for some time (3-4 minutes). Most importantly, I think: if I try to access the API (it's REST-style over HTTP), Firefox gives me the error "The connection has been reset". I've tried restarting the server, restarting Apache2, restarting MySQL, and viewing the Apache2/MySQL logs. I don't know much about systems, so I don't know what else to do.
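
    A rough diagnostic sketch for the next time it happens, assuming a stock Apache2 on Ubuntu (the paths and the prefork guess are assumptions): the restart-fixes-it-for-a-few-minutes pattern often points at the MaxClients/connection limit being exhausted.

        apache2ctl -V | grep -i mpm                                           # which MPM the server was built with
        grep -A 8 'IfModule mpm_prefork_module' /etc/apache2/apache2.conf     # MaxClients and related limits
        netstat -ant | grep -c ':80 '                                         # open connections while the API is unresponsive
        tail -f /var/log/apache2/error.log                                    # watch for "MaxClients reached" at failure time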

    Read the article

  • .NET app - Should we use SQL Server and duplicate some reference data from an external Oracle DB? Or use Oracle and have a DB link?

    - by Daventry
    We're looking to migrate some existing Excel/Access processes into a new system which will provide users with a Silverlight frontend to run and view the reports instead of using MS Access. The initial idea was to have SQL Server 2008 as the RDBMS. The problem is that we've got some static data, such as country codes and counterparties, which lives in an existing Oracle DB. Since we do not want to duplicate that data (if possible), we were thinking of having a DB link between SQL Server and Oracle, but our firm does not allow that. So the options are either to duplicate the data or to use Oracle as the RDBMS - surprise, the firm does allow DB links between Oracle databases. The initial idea was also to use WCF RIA Services, Entity Framework, etc., which we're not sure play well with Oracle; that's why it was decided to go with SQL Server in the first place. Would you advise going with Oracle so that we can just link the static data, or using SQL Server 2008 and replicating the data because it's "safer" to stay within the Microsoft world? And should we use Entity Framework and WCF RIA Services at all? Regards. UPDATE: Thanks everyone for your answers. Nothing is set in stone yet. We'll try to import the data instead of linking it, so that if the other DB goes down, our system can still carry on. We're likely to use SQL Server just because most of our developers are more experienced with it. Even if we use RIA Services, we can swap out the data access layer and use other frameworks such as those mentioned below.

    Read the article

  • Apache mod_remoteip and access logs

    - by GioMac
    Since Apache 2.4 I've started using mod_remoteip instead of mod_extract_forwarded for rewriting the client address from the X-Forwarded-For header provided by frontend servers (Varnish, Squid, Apache, etc.). So far everything works fine with the modules themselves - in PHP, CGI, WSGI, etc. the client addresses are shown as they should be - but I can't get the client address written to the access logs (%a, %h, %{c}a). No luck: I always get 127.0.0.1 (the forwarding frontend, e.g. localhost). How do I log the client's IP address when using mod_remoteip?
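
    A hedged sketch of a mod_remoteip setup plus a log format that records the rewritten address (the module path and proxy address are examples): with mod_remoteip active, %a logs the client taken from X-Forwarded-For, while %{c}a still logs the connection peer, i.e. the frontend.

        LoadModule remoteip_module modules/mod_remoteip.so
        RemoteIPHeader X-Forwarded-For
        RemoteIPInternalProxy 127.0.0.1            # the frontend whose X-Forwarded-For we trust

        # %a = client address after mod_remoteip, %{c}a = raw connection peer
        LogFormat "%a %{c}a %l %u %t \"%r\" %>s %b" remoteip_combined
        CustomLog logs/access_log remoteip_combined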

    Read the article

  • Nagios LDAP-group-based frontend login permission issues

    - by Eleven-Two
    I want to grant users access to the Nagios 3 Core frontend by using an Active Directory group ("NagiosWebfrontend" in the code below). The login works fine like this:

        AuthType Basic
        AuthName "Nagios Access"
        AuthBasicProvider ldap
        AuthzLDAPAuthoritative on
        AuthLDAPURL "ldap://ip-address:389/OU=user-ou,DC=domain,DC=tld?sAMAccountName?sub?(objectClass=*)"
        AuthLDAPBindDN CN=LDAP-USER,OU=some-ou,DC=domain,DC=tld
        AuthLDAPBindPassword the_pass
        Require ldap-group CN=NagiosWebfrontend,OU=some-ou,DC=domain,DC=tld

    Unfortunately, every Nagios page just shows "It appears as though you do not have permission to view information for any of the services you requested...". I got the hint that I am missing a contact in the Nagios configuration equal to my login, but creating one with the same name as the domain user had no effect on the issue. However, it would be great to find a solution without manually editing nagios.conf for every new user, so the admins could grant access to Nagios just by putting the user into the "NagiosWebfrontend" group. What would be the best way to solve this?
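
    On the Nagios side, the CGIs decide what a logged-in user may see via cgi.cfg rather than Apache. A hedged sketch of the relevant options (option names are from Nagios 3's cgi.cfg; the user list is illustrative):

        # grant named users full visibility ...
        authorized_for_all_services=nagiosadmin,jsmith
        authorized_for_all_hosts=nagiosadmin,jsmith
        # ... or, to avoid per-user edits, trust anyone who passed the Apache/LDAP login:
        # authorized_for_all_services=*
        # authorized_for_all_hosts=*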

    Read the article

  • Choosing a CMS to use with backend modules involving Haskell and Python [on hold]

    - by Butterflycode
    Hi, I am trying to decide on a CMS to use for a new project. Security is the most important aspect of the CMS. I am looking to use a PHP-based CMS such as Joomla or Drupal; however, PHP has many security flaws, which worries me. The data which needs to be secure will be inside a database and relates to account information. I am wondering what the best way to do this is. What I want is a frontend made in PHP/JS (Joomla), and then a backend API written in Haskell to handle money transfers, ensuring nothing goes wrong. In between the two I want a controller written in perhaps Python or C. I never want the PHP to touch the database; I want it to relay messages to the controller written in Python or C, which then writes to the database, sanitising data, etc. Am I perhaps thinking too deeply about this? Just wondering if anyone has any ideas on what I should do. I can't quite explain what the project is, as I don't want the idea to be stolen, but it involves a lot of money transactions, so security is essential.

    Read the article

  • What are the most in-demand web-development languages for startups today?

    - by Liston Catch
    What technologies are in demand nowadays for web development at web startups? For the frontend it's all clear: HTML5, JS, AJAX, jQuery. But what about the backend? What languages (and frameworks) should I consider using? I am not asking "which language is best"; I just need a list of modern languages and frameworks (and not Pascal, Delphi or Basic) which are in demand and well paid. UPD: I totally reject the "it's all about logic, not about language; language is just a tool" concept. While it is theoretically true, in reality the time you need to learn the required frameworks is measured in months, so the language does matter. That's why I made this topic. UPD 2: Mason Wheeler, so you seriously advise me to go for Delphi? You think it's in demand nowadays? Or are you just telling me an exception which only confirms the rule? It's like saying "one guy won $100,000,000 in a lottery - just so you know that the lottery is not a bad way to earn money."

    Read the article

  • Did I Inadvertently Create a Mediator in my MVC?

    - by SoulBeaver
    I'm currently working on my first biggish project. It's a frontend Facebook application that has, since last Tuesday, grown to some 6000-8000 LOC. I mention this because I'm using MVC, an architecture I have never rigidly enforced in any of my hobby projects. I read part of the PureMVC book, but I didn't quite grasp the concept of the Mediator. Since I didn't understand it and didn't see the need for it, my project has yet to use a single mediator. Yesterday I went back to the design board because of some requirement changes and noticed that I could move all UI elements out of the View and into their own class. The View then essentially only managed the lifetime of the UI and all events from the UI or Model; technically, the View had become a 'Mediator' between the Model and the UI. Therefore, I realized today, I could just move all my UI code back into the View and create a mediator class that handles all events from the View and Model. Is my understanding correct in thinking that my View as it currently stands (handling events from the Model and UI) has devolved into a Mediator, and that the UI class is what should be the View?

    Read the article

  • MS Access 2007 end user access

    - by LtDan
    I need some good advice. I have used Access for many years, and I use SharePoint, but never the two combined. My newly created Access DB needs to be shared with many users across the organization. The back end is SQL Server. The old way to distribute the database would be to place the DB on a shared drive, set up ODBC connections on the users' PCs to the SQL DB, and then let them open the database and have at it. This has become the OLD way. What is the best (and simplest) way to let end users use a frontend for data entry, editing, reporting, etc.? Can I create a link through SharePoint and have the user just open it from there? Your good advice is greatly appreciated.

    Read the article

  • For a particular domain, how can I cache its JSON responses locally?

    - by Chris
    I'm coding the frontend of a web app that uses XHR to grab JSON data from a 3rd party. The 3rd party service is slow, and because of its API design we need to make a LOT of API requests every time I refresh the page to test some new code. It's making the development loop painful. The requests are GETs, POSTs and PUTs, even though I'm pretty sure none of them change state. I want to go to localhost for the JSON rather than to this 3rd party API, simply to make my development process faster.
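
    One low-effort option is to run a local caching proxy during development and point the frontend's API base URL at it. A hedged nginx sketch (the upstream host, port and cache lifetime are placeholders; caching POSTs like this assumes the calls really don't change state, and nginx won't cache PUTs at all, so those still hit the upstream):

        proxy_cache_path /tmp/devcache keys_zone=devcache:10m;

        server {
            listen 8081;
            location / {
                proxy_pass https://api.thirdparty.example;
                proxy_cache devcache;
                proxy_cache_methods GET HEAD POST;              # POST must be listed explicitly
                proxy_cache_key "$request_method$request_uri";  # note: ignores request bodies
                proxy_cache_valid any 10m;
                proxy_ignore_headers Cache-Control Expires;     # cache even if the API says not to
            }
        }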

    Read the article

  • Searching for online database software/cms

    - by ButterdBread
    I am searching for software or a CMS that manages and displays large online databases, as a kind of frontend to MySQL or any other database. It should be accessible through the browser and be as secure as possible (offering a login). The data I'd like to store would be personal information such as name, address and birthday; I'd also need to be able to add custom fields. Forms and the possibility to download the data as an Excel(?) table would also be great. phpMyAdmin is not an option; it should be similar to a CRM but adapted more closely to managing database tables, searching for entries and filtering data. It should be possible to have many user accounts with different rights, with each of them able to access certain parts of the data and enter their own data. Is there something out there that comes close to what I imagine? I appreciate any help!

    Read the article

  • uploading files greater than 1MB = connection resets

    - by Legit
    I'm using nginx on the frontend as a "proxy cache" and Apache on the backend. I've set my PHP settings to the following:

        error_log = /var/www/site1/php_error.log
        error_reporting = 22527
        file_uploads = On
        log_errors = On
        max_execution_time = 0
        max_file_uploads = 20
        max_input_time = -1
        memory_limit = 512M
        post_max_size = 0
        upload_max_filesize = 1000M

    What's the problem? Uploading files smaller than 1 MB succeeds, but for anything larger Google Chrome outputs: Error 101 (net::ERR_CONNECTION_RESET): The connection was reset. I already checked for the PHP error log file, but it doesn't exist in that directory. I also checked /var/log/httpd/error_log, but found no upload-related problems. I don't know what else might have caused the problem, so I'm reaching out for your helping hand. Thanks!
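
    One thing worth checking given the 1 MB cutoff: nginx's client_max_body_size defaults to 1m, and requests above it are rejected at the frontend before they ever reach Apache/PHP, which a browser can surface as a reset connection. A minimal sketch (the size is illustrative):

        # in the nginx frontend config, at http, server or location level
        client_max_body_size 100m;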

    Read the article

  • How do I generate and post XML in C#? [migrated]

    - by user2922687
    I am new to C# and faced with a similar problem. I need to generate and post XML to a URL, but the parameters in the XML fields should be dynamic, taking input from a frontend app. This is the layout of the XML:

        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ret="http://xxx.xxx.com">
            <soapenv:Header/>
            <soapenv:Body>
                <ret:Vend>
                    <DestAccount>?</DestAccount>
                    <Amount>?</Amount>
                    <Msg>?</Msg>
                    <SequenceNo>?</SequenceNo>
                    <DealerNo>?</DealerNo>
                    <Password>?</Password>
                </ret:Vend>
            </soapenv:Body>
        </soapenv:Envelope>

    Can anyone assist with how I can generate this in C#?
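
    A rough sketch of one way to do this with LINQ to XML and HttpWebRequest (the endpoint URL, SOAPAction value and field values are placeholders; in practice the values would come from the frontend app):

        using System;
        using System.IO;
        using System.Net;
        using System.Text;
        using System.Xml.Linq;

        class VendClient
        {
            static void Main()
            {
                XNamespace soapenv = "http://schemas.xmlsoap.org/soap/envelope/";
                XNamespace ret = "http://xxx.xxx.com";

                // Build the envelope; the Vend field values below are placeholders.
                XDocument envelope = new XDocument(
                    new XElement(soapenv + "Envelope",
                        new XAttribute(XNamespace.Xmlns + "soapenv", soapenv),
                        new XAttribute(XNamespace.Xmlns + "ret", ret),
                        new XElement(soapenv + "Header"),
                        new XElement(soapenv + "Body",
                            new XElement(ret + "Vend",
                                new XElement("DestAccount", "0712345678"),
                                new XElement("Amount", "100"),
                                new XElement("Msg", "test vend"),
                                new XElement("SequenceNo", "1"),
                                new XElement("DealerNo", "42"),
                                new XElement("Password", "secret")))));

                byte[] payload = Encoding.UTF8.GetBytes(envelope.ToString());

                // POST it; the URL and SOAPAction are whatever the service expects.
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/vendservice");
                request.Method = "POST";
                request.ContentType = "text/xml; charset=utf-8";
                request.Headers.Add("SOAPAction", "\"\"");
                request.ContentLength = payload.Length;

                using (Stream body = request.GetRequestStream())
                {
                    body.Write(payload, 0, payload.Length);
                }

                using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                using (StreamReader reader = new StreamReader(response.GetResponseStream()))
                {
                    Console.WriteLine(reader.ReadToEnd());
                }
            }
        }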

    Read the article

  • Solutions for a webserver dedicated to managing permissions/ACLs and (reverse) proxying API servers?

    - by giohappy
    I'm considering various layouts for exposing several HTTP API services (running on their own separate servers) through a frontend server dedicated to managing permissions on behalf of the API services. I've considered various options, from the classical ones like Nginx, Apache, etc. to HAProxy, by way of the various Python webserver solutions like Tornado and Twisted (which would give me the opportunity to implement my own ACL system easily). The fundamental requirements are high performance and scalability, and the ability to manage fine-grained ACL rules (similar to the HAProxy ACL system). I would like to know what a suggested approach to setting this up would be, and whether (open source) ready-to-use solutions dedicated to this already exist.
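
    For illustration, a hedged sketch of the kind of fine-grained rule HAProxy's ACL system supports (service paths, networks, addresses and backend names are placeholders):

        defaults
            mode http
            timeout connect 5s
            timeout client  30s
            timeout server  30s

        frontend api_gateway
            bind *:80
            acl is_svc_a  path_beg /service-a/
            acl internal  src 10.0.0.0/8
            http-request deny if is_svc_a !internal   # only internal clients may reach service A
            use_backend svc_a if is_svc_a
            default_backend svc_b

        backend svc_a
            server a1 10.0.1.10:8080

        backend svc_b
            server b1 10.0.1.20:8080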

    Read the article

  • apt-get install Error

    - by LINUX4U
    During installation, syslogd gives the following error on the server. How do I diagnose this problem?

        debconf: falling back to frontend: Readline
        Selecting previously deselected package sysklogd.
        (Reading database ... 32541 files and directories currently installed.)
        Unpacking sysklogd (from .../sysklogd_1.5-5ubuntu4_amd64.deb) ...
        Selecting previously deselected package klogd.
        Unpacking klogd (from .../klogd_1.5-5ubuntu4_amd64.deb) ...
        Setting up sysklogd (1.5-5ubuntu4) ...
         * Starting system log daemon...        [ OK ]
        Setting up klogd (1.5-5ubuntu4) ...
         * Starting kernel log daemon...        [fail]

    Read the article
