Search Results

Search found 10280 results on 412 pages for 'technology choice'.


  • 7-Eleven Improves the Digital Guest Experience With 10-Minute Application Provisioning

    - by MichaelM-Oracle
    By Vishal Mehra - Director, Cloud Computing, Oracle Consulting

    Making the Cloud Journey Matter

    There's much more to cloud computing than cutting costs and closing data centers. In fact, cloud computing is fast becoming the engine for innovation and productivity in the digital age. Oracle Consulting Services contributes to our customers' cloud journey by accelerating application provisioning and rapidly deploying enterprise solutions. By blending flexibility with standardization, our Middleware as a Service (MWaaS) offering is ensuring the success of many cloud initiatives.

    10-Minute Application Provisioning Times at 7-Eleven

    As a case in point, 7-Eleven recently highlighted the scope, scale, and results of a cloud-powered environment. The world's largest convenience store chain is rolling out a Digital Guest Experience (DGE) program across 8,500 stores in the U.S. and Canada. Every day, 7-Eleven connects with tens of millions of customers through point-of-sale terminals, websites, and mobile apps. Promoting customer loyalty, targeting promotions, downloading digital coupons, and accepting digital payments are all part of the roadmap for a comprehensive and rewarding customer experience. And what about the time required for deploying successive versions of this mission-critical solution? Ron Clanton, 7-Eleven's DGE Program Manager, Information Technology, reported at Oracle OpenWorld: "We are now able to provision new environments in less than 10 minutes. This includes the complete SOA Suite on Exalogic, and Enterprise Manager managing both the SOA Suite, Exalogic, and our Exadata databases."

    OCS understands the complex nature of innovative solutions and has the processes and expertise to help clients like 7-Eleven rapidly develop technology that enhances the customer experience with little more than the click of a button. OCS understood that the 7-Eleven roadmap required careful planning, agile development, and a cloud-capable environment to move fast and perform at enterprise scale.

    Business Agility

    Today's business-savvy technology leaders face competing priorities as they confront the digital disruptions of the mobile revolution and next-generation enterprise applications. To support an innovation agenda, IT is required to balance competing priorities between development and operations groups. Standardization and consolidation of computing resources are the keys to success. With our operational and technical expertise promoting business agility, Oracle Consulting's deep Middleware as a Service experience can make a significant difference to our clients by empowering enterprise IT organizations with the computing environment they seek, to keep up with the pace of change that digitally driven business units expect. Depending on the needs of the organization, this environment runs within a private, public, or hybrid cloud infrastructure. Through on-demand access to a shared pool of configurable computing resources, IT delivers the standard tools and methods for developing, integrating, deploying, and scaling next-generation applications. Gold profiles of predefined configurations eliminate the version mismatches among databases, application servers, and SOA Suite components, delivered both by Oracle and other enterprise ISVs. These computing resources are well defined in business terms, enabling users to select what they need from a service catalog.

    Striking the Balance between Development and Operations

    As a result, development groups have the flexibility to choose among a menu of available services with descriptions of standard business functions, service level guarantees, and costs. Faced with the consumerization of enterprise IT, they can deliver the innovative customer experiences that seamlessly integrate with underlying enterprise applications and services. This cloud-powered development and testing environment accelerates release cycles to ensure agile development and rapid deployments. At the same time, the operations group relies on certified stacks and frameworks, tuned to predefined environments and patterns. Operators can maintain a high level of security and continue best practices for applications/systems monitoring and management. Moreover, faced with the challenge of delivering on service level agreements (SLAs) with the business units, operators can ensure performance, scalability, and reliability of the infrastructure. The elasticity of a cloud computing environment (the ability to rapidly add virtual machines and storage in response to computing demands) makes a real difference for hardware utilization and efficiency.

    Contending with Continuous Change

    What does it take to succeed on the promise of the cloud? As the engine for innovation and productivity in the digital age, IT must face not only the technical transformations but also the organizational challenges of the cloud. Standardizing key technologies, resources, and services through cloud computing is only one part of the cloud journey. Managing relationships among multiple departments and projects over time (developing the management, governance, and monitoring capabilities within IT) is an often unmentioned but all too important second part. In fact, IT must have the organizational agility to contend with continuous change. This is where a skilled consulting services partner can play a pivotal role as a trusted advisor in the successful adoption of cloud solutions. With a lifecycle services approach to delivering innovative business solutions, Oracle Consulting Services has the expertise and a portfolio of services to help enterprise customers succeed on their cloud journeys, as well as on other converging megatrends.

    Read the article

  • UK OUG Conference Highlights and Insights

    - by Richard Bingham
    As per my preemptive post, this was the first time the annual conference organized by the UK Oracle User Group (UKOUG) was split into two events: one for Oracle Applications and another, in December, for Oracle Technology. Apps13, as it was branded, was hailed as a success, with over 1000 registered attendees and three days of sessions, exhibition, round-tables and many other types of content. As this poster on their stand illustrates, the UKOUG is a strong community with popular participants from both big and small Oracle partners and customers. The venue was a more intimate setting than in previous years too, allowing everyone to casually bump into the people they hoped to. It gave a real feeling of an Apps community. The main themes over the days were CRM and Customer Experience, HCM, and FIN/SCM, which allowed people to attend just one focused day if they wanted. In addition, the Apps Transformation stream ran across all three days, offering insights, advice, and details on the newer product solutions like Fusion Applications.

    Here are some of the key take-aways I got from the conference, specific to my role in Fusion Applications Developer Relations:

    User Experience continues to be a significant reason for adopting some of the newer application products available, with immediately obvious gains in user productivity and satisfaction reported by customers. And this doesn't stop with the baked-in UX either: the Design Patterns are proving popular and are currently being extended to include things like extending ADF Mobile and customizing the Simplified UI. More on this to come from us soon.

    The executive sessions emphasized the "it's a journey" phrase, illustrating that modern business applications are powered by technologies such as cloud, mobile, social and big data, and that these can be harnessed to help propel your organization forward. Indeed, the emphasis is moving away from the traditional vendor-prescribed linear applications roadmap and towards plotting a course based on business priorities supported by a broad range of integrated solutions. To help with this, several conference sessions demoed the new "Applications Navigator" tool, developed in partnership with OUG members, which offers a visual framework to help organizations plan their Oracle Applications investments around business and technology imperatives. Initial reaction was positive, especially as customers do not need to decipher Oracle's huge product catalog, and it embeds the best blend of proven and integrated applications solutions. We'll share more on this when it is generally available.

    Several sessions focused on explanations and interpretation of Oracle OpenWorld 2013, helping highlight the key Oracle Applications messages and directions. With a relatively small percentage of conference attendees also at OpenWorld (judging from a show of hands), this was a popular way to distill the available information down into specific items of interest for the community. Please note the original OpenWorld 2013 content is still available for download but will not remain available forever (via the Oracle website OpenWorld Content Catalog > pick a session > see the PDF download).

    With the release of E-Business Suite 12.2, the move to develop and deploy on the Fusion Middleware stack becomes a reality for many Oracle Applications customers. This, coupled with recent E-Business Suite features such as the Integrated SOA Gateway and the E-Business Suite SDK for Java, illustrates how the gap between the technologies and techniques involved in extending E-Business Suite and Fusion Applications is quickly narrowing. We'll see this merging continue to evolve going forward.

    Getting started with Oracle Cloud Applications is actually easier than many customers expected, with a broad selection of both large and medium-sized organizations explaining how they added new features to their existing Oracle Applications portfolios. New functionality available from Fusion HCM and CX makes for popular extensions that do not have to disrupt those core business services. Coexistence is the buzzword here, and the available integration is also simpler than many expected, commonly involving an initial setup data load and then regular incremental synchronizations, often without a need for constant real-time communication between systems. With much of this pre-built already, the implementation process is also quite rapid.

    With most people dressed in suits, we wanted to get the conversations going without the traditional English reserve, so we decided to make ourselves a bit more obvious, as the photo below shows. This seemed to be quite successful and helped those interested identify and approach us. Keep a lookout for similar again. In fact, if you're in the UK there is an "Apps Transformation Day" planned by the UKOUG for 19th March 2014, with more details to follow. Again, something we'll be sure to participate in.

    I am hoping to attend the other half of the UKOUG annual conference, Tech13, which focuses more on Oracle technology and where there is likely to be a larger attendance of those interested in the lower-level aspects of applications customization and development. If you're going, let me know and maybe we can meet up.

    Read the article

  • Clouds, Clouds, Clouds Everywhere, Not a Drop of Rain!

    - by sxkumar
    At the recently concluded Oracle OpenWorld 2012, the center of discussion was clearly cloud. Over the five action-packed days, I got to meet a large number of customers, and most of them had serious interest in all things cloud. Public cloud - particularly the Oracle Cloud - clearly got a lot of attention and interest. I think the use cases and the value proposition for public cloud are pretty straightforward. However, when it comes to private cloud, there were some interesting revelations. Well, I shouldn't really call them revelations, since they are pretty consistent with what I have heard from customers at other conferences as well as during 1:1 interactions. While interest in enterprise private cloud remains very high, only a handful of enterprises have truly embarked on a journey to create what the purists would call a true private cloud - with capabilities such as self-service and chargeback/showback. For a large majority, today's reality is simply consolidation and virtualization - and they are quite far off from creating an agile, self-service and transparent IT infrastructure, which is what the enterprise cloud is all about. Even the handful who have actually implemented a close-to-real enterprise private cloud have taken an infrastructure-centric approach and are seeing only limited business upside. Quite a few were frank enough to admit that chargeback and self-service aren't something they see an immediate need for.

    This is in quite a contrast to the picture painted by all those surveys out there showing a large number of enterprises having already implemented an enterprise private cloud. On the face of it, this seems quite contrary to the observations outlined above. So what exactly is the reality? Well, the reality is that there is undoubtedly a huge amount of interest among enterprises in transforming their legacy IT environment - which is often seen as too rigid, too fragmented, and ultimately too expensive - into something more agile, transparent and business-focused. At the same time, however, there is a great deal of confusion among CIOs and architects about how to get there. This isn't very surprising given all the buzz and hype surrounding cloud computing. Every IT vendor claims to have the most unique solution, and there isn't a single IT product out there that does not have a cloud angle to it. Add to this the chatter in the blogosphere, and it would set even a sane mind spinning. Consequently, most enterprises are still struggling to fully understand the concept and value of the enterprise private cloud. Even among those who have chosen to move forward relatively early, quite a few have made their decisions based more on vendor influence/preferences than on what their businesses actually need. Clearly, there is a disconnect between the promise of the enterprise private cloud and the current adoption trends.

    So what is the way forward? I certainly do not claim to have all the answers. But here is a perspective that many cloud practitioners have found useful and thus worth sharing. To take a step back, the fundamental premise of the enterprise private cloud is IT transformation. It is the quest to create a more agile, transparent and efficient IT infrastructure that is driven more by business needs than constrained by operational and procedural inefficiencies. It is the new way of delivering and consuming IT services - where IT organizations operate more like enablers of strategic services rather than just being the gatekeepers of IT resources. In an enterprise private cloud environment, IT organizations are expected to empower end users via self-service access/control and to provide business stakeholders a transparent view of how the resources are being used, what the cost of delivering a given service is, how well customers are being served, and so on. But the most important thing to note here is that the enterprise private cloud is not just an IT project; rather, it is a business initiative to create an IT setup that is more aligned with the needs of today's dynamic and highly competitive business environment. Surprised? You shouldn't be. Just remember how business users have been at the forefront of public cloud adoption within enterprises - and private cloud is no exception.

    Such a broad-based transformation makes cloud more than a technology initiative. It requires people (organizational) and process changes as well, and these changes are as critical as the choice of the right tools and technology. In my next blog, I will share how essential it is for enterprise cloud technology to go hand in hand with process re-engineering and organizational changes to unlock the true value of the enterprise cloud. I am sharing a short video from my session "Managing your Private Cloud" at Oracle OpenWorld 2012. More videos from this session will be posted at the recently introduced Zero to Cloud resource page. Many other experts on Oracle's enterprise private cloud solution will join me on the "Zero to Cloud" blog and share best practices, deployment tips and information on how to plan, build, deploy, monitor, manage, meter and optimize the enterprise private cloud. We look forward to your feedback and suggestions, and to having an engaging conversation with you on this blog.

    Read the article

  • State of the (Commerce) Union: What the healthcare.gov hiccups teach us about the commerce customer experience

    - by Katrina Gosek
    Guest post by Brenna Johnson, Oracle Commerce Product

    A lot has been said about the healthcare.gov debacle in the last week. Regardless of your feelings about the Affordable Care Act, there's a hidden issue in this story that most of the American people don't understand: delivering a great commerce customer experience (CX) is hard. It shouldn't be, but it is. The reality of the government's issues getting the healthcare site up and running smoothly is something we in the online commerce community know too well. If there's one thing the botched launch of the site has taught us, it's that regardless of the size of your budget or the power of an executive with a high-profile project, some of the biggest initiatives with the most attention (and the most at stake) don't go as planned. It may even give you a moment of solace - we have the same issues! But why?

    Organizations engage too many separate vendors with different technologies, running sections or pieces of a site to get live. When things go wrong, it takes time to identify the problem - and who or what is at the center of it. Unfortunately, this is a brittle way of setting up a site, making it susceptible to breaks, bugs, and scaling issues. But it's the reality of running a site with legacy technology constraints in today's demanding, customer-centric market. This approach also means there are a lot of cooks in a lot of different kitchens. You've got development and IT, the business and the marketing team, an external systems integrator to bring it all together, a digital agency or consultant, QA, product experts, 3rd party suppliers, and the list goes on. To complicate things, different business units are held responsible for different pieces of the site and for managing different technologies. And again - due to legacy organizational structure and processes, this is all accepted as the normal State of the Union.

    Digital commerce has been commonplace for 15 years. Yet getting a site live, maintained and performing requires orchestrating a cast of thousands (or at least dozens), big dollars, and some finger-crossing. But it shouldn't. The great thing about the advent of mobile commerce and the continued maturity of online commerce is that it has forced organizations to think from the outside in. Consumers - whether they're shopping for shoes or a new healthcare plan - don't care about what technology issues or processes you have behind the scenes. They just want it to work. They want their experience to be easy, fast, and tailored to them and their needs - whatever they are. This doesn't sound like a tall order to the American consumer - especially since they interact with sites that do work smoothly. But the reality is that it takes scores of people, teams, check-ins, late nights, testing, and some good luck to get sites to run, and even more so at Black Friday (or October 1st) traffic levels. The last thing on a customer's mind is making excuses for why they can't buy a product - just get it to work.

    So what is the government doing? My guess is working day and night to get the site performing - and having to throw big money at the problem. In the meantime they're sending frustrated online users to the call center, or even to a location where a trained "navigator" can help them complete their selection in person. Sounds a lot like multichannel commerce (where broken communication between siloed touchpoints will only frustrate the consumer more).

    One thing we've learned is that consumers spend their time and money with brands they know and trust. When sites are easy to use and adapt to their needs, they tend to spend more, come back, and even become long-time loyalists. Achieving this may require moving internal mountains, but there's too much at stake to ignore the sea change in how organizations are thinking about their customers. If the thought of rethinking your internal teams, technologies, and processes sounds like a headache, think about the pain associated with losing valuable customers - and dollars. Regardless of whether you're in B2B or B2C, it's guaranteed that your competitors are making CX a priority. Those early to the game who have made CX a priority have already begun to outpace their competition. So as you're planning for 2014, look to the news this week. Make sure the customer experience is a focus at your organization. Expectations are at record highs. Map your customer's journey, and think from the outside in. How easy is it for your customers to do business with you? If they interact with many touchpoints across your organization, are the call center, website, mobile environment, and brick-and-mortar location in sync? Do you have the technology in place to achieve this? It's time to give the people what they want!

    Read the article

  • How to move the Windows 7 bootloader to the Windows 7 partition?

    - by pauldoo
    I recently installed Windows 7 in a triple-boot setup alongside XP and Linux. When I was finished and was in the process of restoring the bootloader for Linux, I discovered something strange about what Windows 7 had done. It turned out that Windows 7 had not installed a bootloader on its own partition, and had instead set up a bootloader on the pre-existing XP partition that offers a choice between 7 and XP. This behaviour has been noticed by others. Now my booting is slightly odd: I have GRUB on the MBR, which lets me choose between Linux and Windows. When I select Windows, GRUB boots to the XP partition, where I get the second choice between 7 and XP. Why doesn't the Windows 7 installer put the Windows 7 bootloader on the Windows 7 partition like all previous MS OSs? This is going to be a real problem for me, as I now want to wipe the XP partition and install something else there (probably another non-MS OS). How can I move the bootloader for Windows 7 onto the Windows 7 partition, thus making it bootable and allowing me to safely wipe the XP partition?
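    For reference, one commonly suggested approach uses the Windows 7 boot-file tools themselves. This is only a sketch, not verified on this setup: the drive letters are assumptions and must match your actual layout, and bootsect.exe lives on the install media rather than in a default install.

        rem From an elevated command prompt inside Windows 7 (or WinRE):
        rem copy the boot files and BCD store onto the Windows 7 partition (here C:)
        bcdboot C:\Windows /s C:
        rem write a BOOTMGR-compatible boot sector to that partition
        bootsect /nt60 C:

    After that, pointing GRUB's chainloader at the Windows 7 partition instead of the XP one should boot it directly, so the XP partition can then be wiped.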

    Read the article

  • How to get german QWERTY on Windows?

    - by Arturas M
    Well, I'm used to having the world-standard keyboard, which is QWERTY and not QWERTZ... But on Windows I can't find the choice for a German input that would be QWERTY, not QWERTZ. In Linux there was a German input with QWERTY, so it was fun. I believe it should be on Windows too? Because I'm sick of this QWERTZ, always having to correct and search for z or y... Yeah, I know, if Germans made a mistake, it wasn't losing WW2...

    Edit: Maybe I wasn't clear enough: it's not about changing between different language inputs. It's about having all German keys in the German input, just with the z and y in the correct places, like all the world's keyboards use and like the US keyboard uses...

    Solution: Well, unfortunately, by searching everywhere and waiting for answers I couldn't find what I needed, so I came up with this solution - I used the Microsoft Keyboard Layout Creator and created my own very custom layout by modifying the default German layout and switching the places of z and y. In case somebody needs it and doesn't want to go the hard way, I decided to upload the files. Here are the links to download and install the keyboard layout:

    http://www20.zippyshare.com/v/95071447/file.html
    http://rapidshare.com/files/3822150342/German%20QWERTY%202%20by%20Arturas%20M.zip

    It will appear as a choice under German input between languages in your language bar. If you just want to use this German layout, just remove all other German layouts and install this one. Good luck and have fun using it!

    Read the article

  • Preferred mail system/server for a company?

    - by Trevoke
    Say you are responsible for setting up an email solution at a company. Which would be your choice? I know of the following options, though many of them not well: Gordano Mail System, Exchange, Exim, Postfix, Qmail, Zimbra.

    Having used it for a little over two years, I really, really like Gordano Mail System. They offer a whole bunch of things, like calendaring, anti-spam, anti-virus, extremely complete and filterable logging options, aliases, and a customizable webmail interface. And their software can be installed on either a Windows or a Linux OS. In addition, their support is top-notch and their knowledgebase comprehensive (and, I will admit with a touch of pride, I have contributed, with my questions, to the addition of a few articles in there). Of course, they're not free, which can be a problem, but they're not Exchange, and they do offer pretty much everything that Exchange offers -- which is great if you want to stay away from that but need all the features. Although, if you need a BlackBerry Enterprise Server, or something similar, I'm not sure what you should go for.

    So... What would your choice be? Why? I've never played with a more DIY email solution, but I'm sure many people here have and wouldn't trade their setup for the world :)

    Read the article

  • Is having a [high-end] video card important on a server?

    - by Patrick
    My application is quite interactive, with lots of colors and drag-and-drop functionality, but no fancy 3D stuff, animations or video, so I only used plain GDI (no GDI+, no DirectX). In the past my applications ran on desktops or laptops, and I suggested that my customers invest in a decent video card, with:

    a minimum resolution of 1280x1024
    a minimum color depth of 24 bits
    X megabytes of memory on the video card

    Now my users are switching more and more to terminal servers, hence my question: what is the importance of a video card on a terminal server?

    Is a video card needed at all on the terminal server?
    If it is, is the resolution of the remote desktop client limited to the resolutions supported by the video card on the server?
    Can the choice of video card in the server influence the performance of the applications running on the terminal server (but shown on a desktop PC)?
    If I start to make use of graphical libraries (like Qt) or things like DirectX, will this have an influence on the choice of video card on the terminal server? Are calculations in that case 'offloaded' to the video card, even on the terminal server?

    Thanks.

    Read the article

  • Installing Windows 7 over PXE, preferably with domain autojoin

    - by Ivan Vucica
    At an educational non-profit, I've inherited a previously set-up Windows domain that, after the first reinstall of the machines, we ended up not using, by simply not joining machines back into the domain. Last summer, before the annual reinstall for shipping machines to the summer school, I toyed with the idea of installing Windows 7 over the network instead of just imaging the machines. It took a bit longer than I expected to figure out the basics; honestly, I expected Windows to be more friendly towards PXE installation out of the box.

    What I'm interested in is best practices for installing Windows 7 over PXE with domain autojoin. I'd love it if the whole setup could optionally be hosted on a UNIX-based system as well. I've had some success by preparing an ISO using the Windows Deployment Kit and loading the ISO into memory. This was needed since I wanted a menu, and I think I couldn't get PXELINUX to chainload into Windows' bootloader. Unfortunately, I couldn't figure out much about customization of the Windows setup in that timeframe, nor could I get Samba to work properly; studying the stuff ended up being too lengthy, especially the portion where I edited a disk image on Windows and copied it outside. WDK didn't make things easier by mounting the disk image into RAM and writing it in its entirety when done with it, making me a very sad boy. I've recently found a different approach, too, that appears to be closer to Microsoft's original idea for netboot deployment and does not involve ISOs.

    So my question boils down to the following. What exact approach do you use for netbooting the Windows 7 setup? How can the Windows 7 setup best be customized to be completely unattended, including installation on a specific system partition without destroying the data partition, creation of a passworded admin and a default user, choice of a MAC-address-based hostname, and joining a domain? As many details as possible for everyone's future reference would be appreciated. WDS isn't a bad choice, but if a Linux-based install can be used, that'd be better.
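    For the domain-autojoin piece specifically, here is a minimal sketch of the relevant fragment of an autounattend.xml. The domain name, join account, and architecture value are placeholder assumptions, and a complete answer file also needs the disk-configuration, user-account, and OOBE passes:

        <unattend xmlns="urn:schemas-microsoft-com:unattend">
          <settings pass="specialize">
            <!-- Joins the machine to a domain during the specialize pass -->
            <component name="Microsoft-Windows-UnattendedJoin"
                       processorArchitecture="x86"
                       publicKeyToken="31bf3856ad364e35"
                       language="neutral"
                       versionScope="nonSxS">
              <Identification>
                <JoinDomain>EXAMPLE.LOCAL</JoinDomain>
                <Credentials>
                  <Domain>EXAMPLE.LOCAL</Domain>
                  <Username>domainjoin-svc</Username>
                  <Password>placeholder-password</Password>
                </Credentials>
              </Identification>
            </component>
          </settings>
        </unattend>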

    Read the article

  • Should I use nginx exclusively, or have it as a proxy to Tomcat (performance related)?

    - by Kevin
    I've planned to create a website that'll be pretty heavy on dynamic content, and I want to know what would be the wisest choice for part of my web stack. Right now I'm trying to decide whether I should develop on nginx, using PHP to deliver the dynamic content, or use nginx as a proxy to Tomcat and use servlets to deliver the dynamic content.

    I have a good amount of experience with Java, JSP, and servlets, so that's a plus right off the bat. Also, since Java is a compiled language, it will execute faster than PHP (it is implied here that Java is around 37x faster than PHP) and will create the web pages faster. I have no experience with PHP; however, I'm under the impression that it is easy to pick up. It's slower than Java, but since the client will only be communicating with nginx, I'm thinking that serving the dynamically created web pages to the client will be faster this way.

    Considering these things, I'd like to know:

    Are my assumptions correct? Where does the bottleneck occur: creating pages or serving them back to the client?
    Will proxying Tomcat with nginx give me any of nginx's performance benefits if I'm going to be using Tomcat to generate the dynamic content (keeping in mind my site is going to be heavy in this aspect)?

    I don't mind learning PHP if, in the end, it's going to give me the best performance. I just want to know what would be the best choice from that standpoint.
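    For context, the proxy arrangement being weighed here is conceptually simple. A minimal sketch of an nginx-to-Tomcat configuration might look like this (the port, paths, server name, and upstream name are assumptions):

        upstream tomcat {
            server 127.0.0.1:8080;   # Tomcat's default HTTP connector
        }
        server {
            listen 80;
            server_name example.com;

            # Let nginx serve static assets directly; this is its strength.
            location /static/ {
                root /var/www/example;
            }

            # Hand dynamic requests off to Tomcat's servlets.
            location / {
                proxy_pass http://tomcat;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
            }
        }

    With this split, nginx absorbs static traffic and slow-client connection handling while Tomcat only generates pages, which is where most of the benefit of proxying tends to come from.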

    Read the article

  • maximum size filesystem on my test .... approach?

    - by jocco
    Hello all, I'm new to the site, and I have a question. I got this question on a test and would really like to know the correct approach to solving it. Here is the question:

    In an indexed filesystem, the first indexblock (inode) has 12 direct pointers and 1 pointer to an indirect indexblock. The filesystem is implemented on a disk with a diskblock size of 1024 bytes. All pointers are 32 bit. Question: what is the maximum filesize (kilobytes) of this filesystem? If possible, I'd like not just an answer but an explanation.

    Edit: It was multiple choice, by the way, with 4 answers: a. 13 K, b. 268 K, c. 524 K, d. 1036 K.

    As for my approach, I only got as far as knowing that 1 pointer is 32 bit. Also, I found something else here on the site which seems very useful: http://stackoverflow.com/questions/2755006/understanding-the-concept-of-inodes

    OK, I got this far: there are 12 direct blocks and each block is 1024 bytes. 1024 * 12 = 12288 bytes, or 12 KB, directly accessible. Please correct me if I'm wrong. Each pointer is 32 bit = 4 bytes. And to be honest, at this point I'm starting to get confused, especially since my answer is way over any of my multiple choice answers.
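    For reference, the arithmetic the question is driving at can be finished like this (a sketch of the standard reasoning; it assumes the single indirect block is entirely filled with 4-byte block pointers and 1 KB = 1024 bytes):

        pointers per indirect block = 1024 B / 4 B         = 256
        direct blocks:   12  blocks -> 12 * 1024 B         = 12 KB
        single indirect: 256 blocks -> 256 * 1024 B        = 256 KB
        maximum file size = (12 + 256) * 1024 B            = 268 KB

    That matches choice (b).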

    Read the article

  • How to make a redundant desktop system with daily snapshots? (Is btrfs ready for use?)

    - by TestUser16418
    I want to configure a desktop system in which the home filesystem would be redundant (e.g. RAID-1) and would have weekly snapshots taken. I've already done this with ZFS; the snapshot system is wonderful, and with send/recv you can easily create backups on external media. Unfortunately, this time I want GNU+Linux and not FreeBSD or Solaris, so I'm looking for suggestions for good alternatives. I reckon my alternatives are:

    btrfs - it seems to be exactly what I need: it has snapshots and commands that allow you to easily replicate zfs send. Yet all documentation mentions that it's still experimental. I can't seem to find any actual reports on its reliability or usability issues. Can you point me to any information on that issue that could clarify whether it would be a possible choice? I have a large preference for this option, mostly because I don't want to reformat the drives when btrfs becomes ready, but there's no information on whether it's usable at all, whether it's a silly idea to use it, etc. The question that I cannot get the answer to is what "experimental" means.

    lvm snapshots and ext4 - preferably not, since it can consume an awful amount of space when new files are created. Creating a 200 GB file requires 200 GB of free space plus 200 GB additionally for snapshots. I have also found it unreliable: a failed metadata rewrite results in an unreadable PV. I'm wondering how btrfs would compare here.

    A single filesystem (ext4) on a RAID-1 array with custom COW snapshots using hardlinks (like cp -al). That's my current preference if I can't use btrfs.

    So how experimental is btrfs, which should I choose, and do I have any other options? What if I don't keep external incremental backups - would that affect my choice?
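    For what it's worth, the btrfs workflow being considered maps closely onto the ZFS one. A minimal sketch (the paths are assumptions, and this presumes /home is already a btrfs subvolume and the backup target is btrfs too):

        # Create a read-only snapshot of the home subvolume, named by date.
        btrfs subvolume snapshot -r /home /home/.snapshots/$(date +%Y-%m-%d)

        # Replicate a snapshot to external media, analogous to zfs send/recv.
        btrfs send /home/.snapshots/2012-01-01 | btrfs receive /mnt/backup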

    Read the article

  • Backbone events not firing (this.el undefined) & general feedback on use of the framework

    - by Leo
    I am very new to backbone.js and I am struggling a little. I figured out a way to get data from the server (in JSON) onto the screen successfully, but am I doing it the right/best way? I know there is something wrong, because the only view which contains a valid this.el is the parent view. I suspect that because of this, the events of the views are not firing. What is the best way forward? Here is the code:

    var surveyUrl = "/api/Survey?format=json&callback=?";

    $(function () {
        AnswerOption = Backbone.Model.extend({});
        AnswerOptionList = Backbone.Collection.extend({
            initialize: function (models, options) {
                this.bind("add", options.view.render);
            }
        });

        AnswerOptionView = Backbone.View.extend({
            initialize: function () {
                this.answerOptionList = new AnswerOptionList(null, { view: this });
                _.bindAll(this, 'render');
            },
            events: {
                "click .answerOptionControl": "updateCheckedState" // does not fire because there is no this.el
            },
            render: function (model) {
                // Compile the template using underscore
                var template = _.template($("#questionAnswerOptionTemplate").html(), model.answerOption);
                $('#answerOptions' + model.answerOption.questionId + '>fieldset').append(template);
                return this;
            },
            updateCheckedState: function (data) {
                // never hit...
            }
        });

        Question = Backbone.Model.extend({});
        QuestionList = Backbone.Collection.extend({
            initialize: function (models, options) {
                this.bind("add", options.view.render);
            }
        });

        QuestionView = Backbone.View.extend({
            initialize: function () {
                this.questionlist = new QuestionList(null, { view: this });
                _.bindAll(this, 'render');
            },
            render: function (model) {
                // Compile the template using underscore
                var template = _.template($("#questionTemplate").html(), model.question);
                $("#questions").append(template);
                // append answers using AnswerOptionView
                var view = new AnswerOptionView();
                for (var i = 0; i < model.question.answerOptions.length; i++) {
                    var qModel = new AnswerOption();
                    qModel.answerOption = model.question.answerOptions[i];
                    qModel.questionChoiceType = ChoiceType();
                    view.answerOptionList.add(qModel);
                }
                $('#questions').trigger('create');
                return this;
            }
        });

        Survey = Backbone.Model.extend({
            url: function () {
                return this.get("id") ? surveyUrl + '/' + this.get("id") : surveyUrl;
            }
        });
        SurveyList = Backbone.Collection.extend({
            model: Survey,
            url: surveyUrl
        });

        aSurvey = new Survey({ Id: 1 });

        SurveyView = Backbone.View.extend({
            model: aSurvey,
            initialize: function () {
                _.bindAll(this, 'render');
                this.model.bind('refresh', this.render);
                this.model.bind('change', this.render);
                this.model.view = this;
            },
            // Re-render the contents
            render: function () {
                var view = new QuestionView(); // {el:this.el});
                for (var i = 0; i < this.model.attributes[0].questions.length; i++) {
                    var qModel = new Question();
                    qModel.question = this.model.attributes[0].questions[i];
                    view.questionlist.add(qModel);
                }
            }
        });

        window.App = new SurveyView(aSurvey);
        aSurvey.fetch();
    });

    And the HTML:

    <body>
        <div id="questions"></div>
        <!-- Templates -->
        <script type="text/template" id="questionAnswerOptionTemplate">
            <input name="answerOptionGroup<%= questionId %>" id="answerOptionInput<%= id %>" type="checkbox" class="answerOptionControl"/>
            <label for="answerOptionInput<%= id %>"><%= text %></label>
        </script>
        <script type="text/template" id="questionTemplate">
            <div id="question<%=id %>" class="questionWithCurve">
                <h1><%= headerText %></h1>
                <h2><%= subText %></h2>
                <div data-role="fieldcontain" id="answerOptions<%= id %>">
                    <fieldset data-role="controlgroup" data-type="vertical">
                        <legend> </legend>
                    </fieldset>
                </div>
            </div>
        </script>
    </body>

    And the JSON from the server:

    ? ({ "name": "Survey",
    "questions": [
    { "surveyId": 1, "headerText": "Question 1", "subText": "subtext", "type": "Choice", "positionOrder": 1,
      "answerOptions": [{ "questionId": 1, "text": "Question 1 - Option 1", "positionOrder": 1, "id": 1, "createdOn": "\/Date(1333666034297+0100)\/" }, { "questionId": 1, "text": "Question 1 - Option 2", "positionOrder": 2, "id": 2, "createdOn": "\/Date(1333666034340+0100)\/" }, { "questionId": 1, "text": "Question 1 - Option 3", "positionOrder": 3, "id": 3, "createdOn": "\/Date(1333666034350+0100)\/" }],
      "questionValidators": [{ "questionId": 1, "value": "3", "type": "MaxAnswers", "id": 1, "createdOn": "\/Date(1333666034267+0100)\/" }, { "questionId": 1, "value": "1", "type": "MinAnswers", "id": 2, "createdOn": "\/Date(1333666034283+0100)\/" }],
      "id": 1, "createdOn": "\/Date(1333666034257+0100)\/" },
    { "surveyId": 1, "headerText": "Question 2", "subText": "subtext", "type": "Choice", "positionOrder": 2,
      "answerOptions": [{ "questionId": 2, "text": "Question 2 - Option 1", "positionOrder": 1, "id": 4, "createdOn": "\/Date(1333666034427+0100)\/" }, { "questionId": 2, "text": "Question 2 - Option 2", "positionOrder": 2, "id": 5, "createdOn": "\/Date(1333666034440+0100)\/" }, { "questionId": 2, "text": "Question 2 - Option 3", "positionOrder": 3, "id": 6, "createdOn": "\/Date(1333666034447+0100)\/" }],
      "questionValidators": [{ "questionId": 2, "value": "3", "type": "MaxAnswers", "id": 3, "createdOn": "\/Date(1333666034407+0100)\/" }, { "questionId": 2, "value": "1", "type": "MinAnswers", "id": 4, "createdOn": "\/Date(1333666034417+0100)\/" }],
      "id": 2, "createdOn": "\/Date(1333666034377+0100)\/" },
    { "surveyId": 1, "headerText": "Question 3", "subText": "subtext", "type": "Choice", "positionOrder": 3,
      "answerOptions": [{ "questionId": 3, "text": "Question 3 - Option 1", "positionOrder": 1, "id": 7, "createdOn": "\/Date(1333666034477+0100)\/" }, { "questionId": 3, "text": "Question 3 - Option 2", "positionOrder": 2, "id": 8, "createdOn": "\/Date(1333666034483+0100)\/" }, { "questionId": 3, "text": "Question 3 - Option 3", "positionOrder": 3, "id": 9, "createdOn": "\/Date(1333666034487+0100)\/" }],
      "questionValidators": [{ "questionId": 3, "value": "3", "type": "MaxAnswers", "id": 5, "createdOn": "\/Date(1333666034463+0100)\/" }, { "questionId": 3, "value": "1", "type": "MinAnswers", "id": 6, "createdOn": "\/Date(1333666034470+0100)\/" }],
      "id": 3, "createdOn": "\/Date(1333666034457+0100)\/" },
    { "surveyId": 1, "headerText": "Question 4", "subText": "subtext", "type": "Choice", "positionOrder": 4,
      "answerOptions": [{ "questionId": 4, "text": "Question 4 - Option 1", "positionOrder": 1, "id": 10, "createdOn": "\/Date(1333666034500+0100)\/" }, { "questionId": 4, "text": "Question 4 - Option 2", "positionOrder": 2, "id": 11, "createdOn": "\/Date(1333666034507+0100)\/" }, { "questionId": 4, "text": "Question 4 - Option 3", "positionOrder": 3, "id": 12, "createdOn": "\/Date(1333666034507+0100)\/" }],
      "questionValidators": [{ "questionId": 4, "value": "3", "type": "MaxAnswers", "id": 7, "createdOn": "\/Date(1333666034493+0100)\/" }, { "questionId": 4, "value": "1", "type": "MinAnswers", "id": 8, "createdOn": "\/Date(1333666034497+0100)\/" }],
      "id": 4, "createdOn": "\/Date(1333666034490+0100)\/" }],
    "id": 1, "createdOn": "\/Date(1333666034243+0100)\/" })

    Read the article

  • Where does ASP.NET Web API Fit?

    - by Rick Strahl
    With the pending release of ASP.NET MVC 4 and the new ASP.NET Web API, there has been a lot of discussion of where the new Web API technology fits in the ASP.NET Web stack. There are a lot of choices for building HTTP based applications available now on the stack - we've come a long way from when WebForms and Http Handlers/Modules were the only real options. Today we have WebForms, MVC, ASP.NET Web Pages, ASP.NET AJAX, WCF REST and now Web API, as well as the core ASP.NET runtime, to choose from for building HTTP content. Web API squarely addresses the 'API' aspect - building consumable services - rather than HTML content, but even to that end there are a lot of choices you have today. So where does Web API fit, and when doesn't it? But before we get into that discussion, let's talk about what a Web API is and why we should care.

    What's a Web API?

    HTTP 'APIs' (Microsoft's new terminology for a service, I guess) are becoming increasingly important with the rise of the many devices in use today. Most mobile devices like phones and tablets run apps that use data retrieved from the Web over HTTP. Desktop applications are also moving in this direction, with more and more online content and syncing moving into even traditional desktop applications. The pending Windows 8 release promises an app-like platform for both the desktop and other devices, one that also emphasizes consuming data from the cloud. Likewise, many Web browser hosted applications these days rely on rich client functionality to create and manipulate the browser user interface, using AJAX rather than server generated HTML to load up the user interface with data. These mobile or rich Web applications use their HTTP connection to return data rather than HTML markup, typically in the form of JSON or XML. But an API can also serve other kinds of data, like images or other binary files, or even text data and HTML (although that's less common). A Web API is what feeds rich applications with data. ASP.NET Web API aims to service this particular segment of Web development by providing easy semantics to route and handle incoming requests, and an easy to use platform to serve HTTP data in just about any content format you choose to create and serve from the server.

    But .NET already has various HTTP Platforms

    The .NET stack already includes a number of technologies that provide the ability to create HTTP service back ends, and it has done so since the very beginnings of the .NET platform. From raw HTTP Handlers and Modules in the core ASP.NET runtime, to high level platforms like ASP.NET MVC, Web Forms, ASP.NET AJAX and the WCF REST engine (which technically is not ASP.NET, but can integrate with it), you've always been able to handle just about any kind of HTTP request and response with ASP.NET. The beauty of the raw ASP.NET platform is that it provides everything you need to build just about any type of HTTP application you can dream up, from low level APIs/custom engines to high level HTML generation engines. ASP.NET as a core platform has clearly stood the test of time 10+ years later, and all other frameworks like Web API are built on top of this ASP.NET core. However, although it's possible to create Web APIs/services using any of the existing out-of-box .NET technologies, none of them has been a really nice fit for building arbitrary HTTP based APIs. Sure, you can use an HttpHandler to create just about anything, but you have to build a lot of plumbing to create something more complex, like a comprehensive API that serves a variety of requests, handles multiple output formats and can easily pass data up to the server in a variety of ways. Likewise, you can use ASP.NET MVC to handle routing and create content in various formats fairly easily, but it doesn't provide a great way to automatically negotiate content types and serve various content formats directly (it's possible with some plumbing code of your own, but not built in). Prior to Web API, Microsoft's main push for HTTP services was WCF REST, which was always an awkward technology with a severe personality conflict, never clear on whether it wanted to be part of WCF or purely a separate technology. In the end it didn't do either WCF compatibility or WCF-agnostic pure HTTP operation very well, which made for a very developer-unfriendly environment. Personally I didn't like any of the implementations at the time, so much so that I ended up building my own HTTP service engine (as part of the West Wind Web Toolkit), as did a few other third party tools that provided much better integration and ease of use. With the release of Web API, for the first time I feel that I can finally use the tools in the box and not have to worry about creating and maintaining my own toolkit, as Web API addresses just about all the features I implemented on my own, and much more.

    ASP.NET Web API provides a better HTTP Experience

    ASP.NET Web API differentiates itself from the previous Microsoft in-box HTTP service solutions in that it was built from the ground up around the HTTP protocol and its messaging semantics. Unlike WCF REST or ASP.NET AJAX with ASMX, it's a brand new platform rather than a bolted-on technology that is supposed to work in the context of an existing framework. The strength of the new ASP.NET Web API is that it combines the best features of the platforms that came before it to provide a comprehensive and very usable HTTP platform. Because it's based on ASP.NET and borrows a lot of concepts from ASP.NET MVC, Web API should be immediately familiar and comfortable to most ASP.NET developers.

    Here are some of the features that Web API provides that I like:

    - Strong support for URL routing to produce clean URLs, using familiar MVC style routing semantics
    - Content negotiation based on Accept headers for request and response serialization
    - Support for a host of output formats including JSON, XML, ATOM
    - Strong default support for REST semantics, though they are optional
    - Easily extensible formatter support to add new input/output types
    - Deep support for more advanced HTTP features via HttpResponseMessage and HttpRequestMessage classes and strongly typed enums to describe many HTTP operations
    - Convention based design that drives you into doing the right thing for HTTP services
    - Very extensible, based on an MVC-like extensibility model of formatters and filters
    - Self-hostable in non-Web applications
    - Testable, using testing concepts similar to MVC

    Web API is meant to handle any kind of HTTP input and produce output and status codes using the full spectrum of HTTP functionality available, in a straightforward and flexible manner. Looking at the list above you can see that a lot of the functionality is very similar to ASP.NET MVC, so many ASP.NET developers should feel quite comfortable with the concepts of Web API.
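    To make the programming model concrete, here's a minimal sketch of a Web API controller; the Product type, the in-memory storage, and the conventional api/products route are illustrative assumptions, not from this article:

        using System.Collections.Generic;
        using System.Net;
        using System.Net.Http;
        using System.Web.Http;

        // Hypothetical model type for illustration.
        public class Product
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public class ProductsController : ApiController
        {
            private static readonly List<Product> Products = new List<Product>();

            // GET api/products - the result is serialized as JSON, XML, etc.
            // based on the client's Accept header (content negotiation).
            public IEnumerable<Product> Get()
            {
                return Products;
            }

            // POST api/products - the Product is model-bound from the request body.
            public HttpResponseMessage Post(Product product)
            {
                Products.Add(product);
                return Request.CreateResponse(HttpStatusCode.Created, product);
            }
        }

    Note how the HTTP verb is inferred from the method name by convention, and how the HttpResponseMessage return type exposes the full status code and header surface when you need it.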
    The routing and core infrastructure of Web API are very similar to how MVC works, providing many of the benefits of MVC, but with a focus on HTTP access and manipulation in controller methods rather than HTML generation. There's much improved support for content negotiation based on HTTP Accept headers, with the framework capable of automatically detecting what content the client is sending and requesting, and serving the appropriate data format in return. This seems like such a little and obvious thing, but it's really important. Today's service backends are often used by multiple clients/applications, and being able to choose the data format that fits the client best matters a lot. While previous solutions could accomplish this using a variety of mixed features of WCF and ASP.NET, Web API combines all this functionality into a single robust server side HTTP framework that intrinsically understands HTTP semantics and subtly drives you in the right direction for most operations. And when you need to customize or do something that is not built in, there are lots of hooks and overrides for most behaviors, and even many low level hook points that allow you to plug in custom functionality with relatively little effort.

    No Brainers for Web API

    There are a few scenarios that are a slam dunk for Web API. If the primary focus of an application, or even a part of an application, is some sort of API, then Web API makes great sense.

    HTTP Services: If you're building a comprehensive HTTP API that is to be consumed over the Web, Web API is a perfect fit. You can isolate the logic in Web API and build your application as a service, breaking out the logic into controllers as needed. Because the primary interface is the service, there's no confusion over what should go where (MVC or API). Perfect fit.

    Primary AJAX Backends: If you're building rich client Web applications that rely heavily on AJAX callbacks to serve their data, Web API is also a slam dunk. Again, because much if not most of the business logic will probably end up in your Web API service logic, there's no confusion over where logic should go and there's no duplication. In Single Page Applications (SPA), typically very little HTML based logic is served other than bringing up a shell UI, with the data then filled in from the server via AJAX, which means the business logic required for data retrieval, data acceptance and validation also lives in the Web API. Perfect fit.

    Generic HTTP Endpoints: Another good fit are generic HTTP endpoints that serve data or handle 'utility' type functionality in typical Web applications. If you need to implement an image server or an upload handler, in the past I'd implement that as an HTTP handler. With Web API you now have a well defined place where you can implement these types of generic 'services', in a location that can easily gain endpoints (via controller methods) or be separated out as more full featured APIs. Granted, this could be done with MVC as well, but Web API seems a clearer and more well defined place to keep generic application services. This is one thing I used to do a lot of in my own libraries, and Web API addresses it nicely. Great fit.

    Mixed HTML and AJAX Applications: Not a clear Choice

    For all the commonality that Web API and MVC share, they are fundamentally different platforms that are independent of each other. A lot of people have asked when it makes sense to use MVC vs. Web API when you're dealing with a typical Web application that creates HTML and also uses AJAX functionality for rich behavior. While it's easy to say that all 'service'/AJAX logic should go into a Web API and all HTML related generation into MVC, that can often result in a lot of code duplication. Also, MVC supports JSON and XML result data fairly easily as well, so there's some confusion over where the 'trigger point' is for switching to Web API vs. just implementing functionality as part of MVC controllers. Ultimately there's a tradeoff between isolation of functionality and duplication. A good rule of thumb, I think, is that if a large chunk of the application's functionality serves data, Web API is a good choice; but if you have a couple of small AJAX requests serving data to a grid or autocomplete box, it'd be overkill to separate that logic out into a separate Web API controller. Web API does add overhead to your application (it's yet another framework that sits on top of core ASP.NET), so it should be worth it. Keep in mind that MVC can generate HTML and JSON/XML and just about any other content easily, and that functionality is not going away, so just because Web API is there it doesn't mean you have to use it. Web API is not a full replacement for MVC either, obviously, since there's not the same level of support for feeding HTML from Web API controllers (although you can host a RazorEngine easily enough if you really want to go that route), so if your HTML is part of your API or application in general, MVC is still a better choice, either alone or in combination with Web API. I suspect (and hope) that in the future Web API's functionality will merge even closer with MVC, so that you might even be able to mix the functionality of both into single controllers and not have to make any trade-offs, but at the moment that's not the case.

    Some Issues To Think About

    Web API is similar to MVC but not the same: Although Web API looks a lot like MVC, it's not the same, and some common functionality of MVC behaves differently in Web API. For example, the way single POST variables are handled is different from MVC and doesn't lend itself particularly well to some AJAX scenarios with POST data.

    Code Duplication: I already touched on this in the Mixed HTML and AJAX section, but if you build an MVC application that also exposes a Web API, it's quite likely that you end up duplicating a bunch of code and - potentially - infrastructure. You may have to create authentication logic both for an HTML application and for the Web API, which might need something different altogether. More often than not the same logic is used, and there's no easy way to share it. If you implement an MVC ActionFilter and you want that same functionality in your Web API, you'll end up creating the filter twice.

    AJAX Data or AJAX HTML: On a recent post's comments, David made some really good points regarding the commonality of MVC and Web API and their place. One comment that caught my eye was a little more generic, regarding data services vs. HTML services. David says: "I see a lot of merit in the combination of Knockout.js, client side templates and view models, calling Web API for a responsive UI, but sometimes late at night that still leaves me wondering why I would no longer be using some of the nice tooling and features that have evolved in MVC ;-)"

    You know what - I can totally relate to that. On the last Web based mobile app I worked on, we decided to serve HTML partials to the client via AJAX for many (but not all!) things, rather than sending down raw data to inject into the DOM on the client via templating or direct manipulation. While there are definitely more bytes on the wire with this, the overhead ended up being fairly small if you keep the 'data' requests small and atomic. Performance was often made up for by the lack of client side rendering of HTML. Server rendered HTML for AJAX templating gives so much better infrastructure support, without having to screw around with 20 mismatched client libraries. Especially with MVC and partials it's pretty easy to break out your HTML logic into very small, atomic chunks, so it's actually easy to create small rendering islands that can be used via composition on the server, or via AJAX calls to small, tight partials that return HTML to the client. Although this is often frowned upon as too 'heavy', it worked really well in terms of developer effort as well as providing surprisingly good performance on devices. There's still plenty of jQuery and AJAX logic happening on the client, but it's more manageable in small doses than trying to do the entire UI composition with JavaScript and/or 'not-quite-there-yet' template engines that are very difficult to debug. This is not an issue directly related to Web API, of course, but something to think about, especially for AJAX or SPA style applications.

    Summary

    Web API is a great new addition to the ASP.NET platform, and it addresses a serious need for consolidation of a lot of half-baked HTTP service API technologies that came before it. Web API feels 'right', and hits the right combination of usability and flexibility, at least for me, and it's a good fit for true API scenarios. However, just because a new platform is available, it doesn't mean that other tools or tech that came before it should be discarded, or that everything should be upgraded to the new platform. There's nothing wrong with continuing to use MVC controller methods to handle API tasks if that's what your app is running now - there's very little to be gained by upgrading to Web API just because. But going forward, Web API clearly is the way to go when building HTTP data interfaces, and it's good to see that Microsoft got this one right - it was sorely needed!

    Resources

    ASP.NET Web API
    AspConf Ask the Experts Session (first 5 minutes)

    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in Web API.

    Read the article

  • Wireless Connected But No Internet Connection (Ubuntu 12.04)

    - by Zxy
    I have been using the same network for 2 days and everything was normal. However, today, even though it shows me as connected to the network, I do not have an internet connection. If I use an ethernet cable instead of wireless, I am still able to connect to the internet. My friends are also able to connect to the wireless network and get an internet connection. I have not updated or installed anything since yesterday, so I have no idea why this is happening. Here is some information about my connection; I would appreciate any kind of help. root@ghostrider:/etc/resolvconf# ping 127.0.0.1 PING 127.0.0.1 (127.0.0.1) 56(84) bytes of data. 64 bytes from 127.0.0.1: icmp_req=1 ttl=64 time=0.042 ms 64 bytes from 127.0.0.1: icmp_req=2 ttl=64 time=0.023 ms 64 bytes from 127.0.0.1: icmp_req=3 ttl=64 time=0.036 ms 64 bytes from 127.0.0.1: icmp_req=4 ttl=64 time=0.040 ms ^C --- 127.0.0.1 ping statistics --- 4 packets transmitted, 4 received, 0% packet loss, time 2998ms rtt min/avg/max/mdev = 0.023/0.035/0.042/0.008 ms root@ghostrider:/etc/resolvconf# ping 192.168.1.3 PING 192.168.1.3 (192.168.1.3) 56(84) bytes of data. ^C --- 192.168.1.3 ping statistics --- 19 packets transmitted, 0 received, 100% packet loss, time 18143ms root@ghostrider:/etc/resolvconf# ping 8.8.8.8 PING 8.8.8.8 (8.8.8.8) 56(84) bytes of data. ^C --- 8.8.8.8 ping statistics --- 11 packets transmitted, 0 received, 100% packet loss, time 10079ms root@ghostrider:/etc/resolvconf# cat /etc/lsb-release; uname -a DISTRIB_ID=Ubuntu DISTRIB_RELEASE=12.04 DISTRIB_CODENAME=precise DISTRIB_DESCRIPTION="Ubuntu 12.04 LTS" Linux ghostrider 3.2.0-24-generic-pae #39-Ubuntu SMP Mon May 21 18:54:21 UTC 2012 i686 i686 i386 GNU/Linux root@ghostrider:/etc/resolvconf# lspci -nnk | grep -iA2 net 03:00.0 Ethernet controller [0200]: Atheros Communications Inc. AR8131 Gigabit Ethernet [1969:1063] (rev c0) Subsystem: Lenovo Device [17aa:3956] Kernel driver in use: atl1c -- 04:00.0 Network controller [0280]: Broadcom Corporation BCM4313 802.11b/g/n Wireless LAN Controller [14e4:4727] (rev 01) Subsystem: Broadcom Corporation Device [14e4:0510] Kernel driver in use: wl root@ghostrider:/etc/resolvconf# lsusb Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 001 Device 002: ID 8087:0020 Intel Corp. Integrated Rate Matching Hub Bus 002 Device 002: ID 8087:0020 Intel Corp. Integrated Rate Matching Hub Bus 001 Device 007: ID 0489:e00d Foxconn / Hon Hai Bus 001 Device 004: ID 1c7a:0801 LighTuning Technology Inc. Fingerprint Reader Bus 001 Device 005: ID 064e:f219 Suyin Corp. Bus 002 Device 010: ID 0424:2412 Standard Microsystems Corp. Bus 002 Device 004: ID 046d:c52b Logitech, Inc. Unifying Receiver Bus 002 Device 011: ID 0403:6010 Future Technology Devices International, Ltd FT2232C Dual USB-UART/FIFO IC root@ghostrider:/etc/resolvconf# iwconfig lo no wireless extensions. eth1 IEEE 802.11 ESSID:"PoliTekno" Mode:Managed Frequency:2.462 GHz Access Point: 00:16:E3:40:C3:E4 Bit Rate=54 Mb/s Tx-Power:24 dBm Retry min limit:7 RTS thr:off Fragment thr:off Power Management:off Link Quality=5/5 Signal level=-52 dBm Noise level=-97 dBm Rx invalid nwid:0 Rx invalid crypt:0 Rx invalid frag:0 Tx excessive retries:0 Invalid misc:0 Missed beacon:0 eth0 no wireless extensions. 
root@ghostrider:/etc/resolvconf# rfkill list all 0: brcmwl-0: Wireless LAN Soft blocked: no Hard blocked: no 1: ideapad_wlan: Wireless LAN Soft blocked: no Hard blocked: no 2: ideapad_bluetooth: Bluetooth Soft blocked: no Hard blocked: no 5: hci0: Bluetooth Soft blocked: no Hard blocked: no root@ghostrider:/etc/resolvconf# lsmod Module Size Used by nls_iso8859_1 12617 0 nls_cp437 12751 0 vfat 17308 0 fat 55605 1 vfat usb_storage 39646 0 uas 17828 0 snd_hda_codec_realtek 174055 1 rfcomm 38139 12 parport_pc 32114 0 ppdev 12849 0 bnep 17830 2 joydev 17393 0 ftdi_sio 35859 1 usbserial 37173 3 ftdi_sio snd_hda_intel 32765 3 snd_hda_codec 109562 2 snd_hda_codec_realtek,snd_hda_intel snd_hwdep 13276 1 snd_hda_codec acer_wmi 23612 0 hid_logitech_dj 18177 0 snd_pcm 80845 2 snd_hda_intel,snd_hda_codec uvcvideo 67203 0 btusb 17912 2 snd_seq_midi 13132 0 videodev 86588 1 uvcvideo bluetooth 158438 23 rfcomm,bnep,btusb psmouse 72919 0 usbhid 41906 1 hid_logitech_dj snd_rawmidi 25424 1 snd_seq_midi intel_ips 17753 0 serio_raw 13027 0 hid 77367 2 hid_logitech_dj,usbhid ideapad_laptop 17890 0 sparse_keymap 13658 2 acer_wmi,ideapad_laptop lib80211_crypt_tkip 17275 0 snd_seq_midi_event 14475 1 snd_seq_midi snd_seq 51567 2 snd_seq_midi,snd_seq_midi_event wl 2646601 0 wmi 18744 1 acer_wmi i915 414672 3 drm_kms_helper 45466 1 i915 snd_timer 28931 2 snd_pcm,snd_seq mac_hid 13077 0 snd_seq_device 14172 3 snd_seq_midi,snd_rawmidi,snd_seq lib80211 14040 2 lib80211_crypt_tkip,wl drm 197692 4 i915,drm_kms_helper i2c_algo_bit 13199 1 i915 snd 62064 15 snd_hda_codec_realtek,snd_hda_intel,snd_hda_codec,snd_hwdep,snd_pcm,snd_rawmidi,snd_seq,snd_timer,snd_seq_device video 19068 1 i915 mei 36570 0 soundcore 14635 1 snd snd_page_alloc 14108 2 snd_hda_intel,snd_pcm lp 17455 0 parport 40930 3 parport_pc,ppdev,lp atl1c 36718 0 root@ghostrider:/etc/resolvconf# nm-tool NetworkManager Tool State: connected (global) - Device: eth1 [PoliTekno] ---------------------------------------------------- Type: 802.11 WiFi Driver: wl State: connected Default: yes HW Address: AC:81:12:7F:6B:B2 Capabilities: Speed: 54 Mb/s Wireless Properties WEP Encryption: yes WPA Encryption: yes WPA2 Encryption: yes Wireless Access Points (* = current AP) CnDStudios: Infra, 00:12:BF:3F:0A:8A, Freq 2412 MHz, Rate 54 Mb/s, Strength 85 WPA AIR_TIES: Infra, 00:1C:A8:6E:84:32, Freq 2462 MHz, Rate 54 Mb/s, Strength 72 WPA2 VKSS: Infra, 00:E0:4D:01:0D:47, Freq 2452 MHz, Rate 54 Mb/s, Strength 62 WPA2 PROGEDA: Infra, 00:1A:2A:60:BF:61, Freq 2462 MHz, Rate 54 Mb/s, Strength 47 WPA MobilAtolye: Infra, 72:2B:C1:65:75:3C, Freq 2422 MHz, Rate 54 Mb/s, Strength 35 WPA WPA2 AIRTIES_WAR-141: Infra, 00:1C:A8:AB:AA:48, Freq 2422 MHz, Rate 54 Mb/s, Strength 35 WPA WPA2 tilda_biri_yeni: Infra, 54:E6:FC:B0:3C:E9, Freq 2437 MHz, Rate 0 Mb/s, Strength 34 WEP *PoliTekno: Infra, 00:16:E3:40:C3:E4, Freq 2462 MHz, Rate 54 Mb/s, Strength 100 WPA2 AIRTIES_RJY: Infra, 00:1A:2A:BD:85:16, Freq 2462 MHz, Rate 54 Mb/s, Strength 55 WEP IPv4 Settings: Address: 0.0.0.0 Prefix: 24 (255.255.255.0) Gateway: 192.168.1.1 DNS: 192.168.1.1 - Device: eth0 
----------------------------------------------------------------- Type: Wired Driver: atl1c State: unavailable Default: no HW Address: F0:DE:F1:6C:90:65 Capabilities: Carrier Detect: yes Speed: 100 Mb/s Wired Properties Carrier: off root@ghostrider:/etc/resolvconf# sudo iwlist scan lo Interface doesn't support scanning. eth1 Scan completed : Cell 01 - Address: 00:16:E3:40:C3:E4 ESSID:"PoliTekno" Mode:Managed Frequency:2.462 GHz (Channel 11) Quality:5/5 Signal level:-48 dBm Noise level:-98 dBm IE: IEEE 802.11i/WPA2 Version 1 Group Cipher : CCMP Pairwise Ciphers (1) : CCMP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 18 Mb/s 24 Mb/s; 36 Mb/s; 54 Mb/s; 6 Mb/s; 9 Mb/s 12 Mb/s; 48 Mb/s Cell 02 - Address: 00:E0:4D:01:0D:47 ESSID:"VKSS" Mode:Managed Frequency:2.452 GHz (Channel 9) Quality:4/5 Signal level:-64 dBm Noise level:-98 dBm IE: IEEE 802.11i/WPA2 Version 1 Group Cipher : CCMP Pairwise Ciphers (1) : CCMP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 6 Mb/s 9 Mb/s; 12 Mb/s; 18 Mb/s; 24 Mb/s; 36 Mb/s 48 Mb/s; 54 Mb/s Cell 03 - Address: 00:1C:A8:AB:AA:48 ESSID:"AIRTIES_WAR-141" Mode:Managed Frequency:2.422 GHz (Channel 3) Quality:2/5 Signal level:-77 dBm Noise level:-95 dBm IE: IEEE 802.11i/WPA2 Version 1 Group Cipher : TKIP Pairwise Ciphers (2) : CCMP TKIP Authentication Suites (1) : PSK IE: Unknown: DDB20050F204104A0001101049001E007FC5100018DE7CF0D8B70223A62711C18926AC290E30303030303139631044000102103B0001031047001076B31BC241E953CB99C3872554425A28102100194169725469657320576972656C657373204E6574776F726B73102300074169723534343010240008312E322E302E31321042000F4154303939313131383030323832351054000800060050F20400011011000741697235343430100800020084103C000103 IE: WPA Version 1 Group Cipher : TKIP Pairwise Ciphers (2) : CCMP TKIP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 18 Mb/s 24 Mb/s; 36 Mb/s; 54 Mb/s; 6 Mb/s; 9 Mb/s 12 Mb/s; 48 Mb/s Cell 04 - Address: 72:2B:C1:65:75:3C ESSID:"MobilAtolye" Mode:Managed Frequency:2.422 GHz (Channel 3) Quality:2/5 Signal level:-78 dBm Noise level:-92 dBm IE: IEEE 802.11i/WPA2 Version 1 Group Cipher : TKIP Pairwise Ciphers (2) : TKIP CCMP Authentication Suites (1) : PSK IE: Unknown: DDA20050F204104A0001101044000102103B00010310470010BC329E001DD811B28601722BC165753C1021001D48756177656920546563686E6F6C6F6769657320436F2E2C204C74642E1023001C48756177656920576972656C6573732041636365737320506F696E74102400065254323836301042000831323334353637381054000800060050F204000110110009487561776569415053100800020084103C000100 IE: WPA Version 1 Group Cipher : TKIP Pairwise Ciphers (2) : TKIP CCMP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 9 Mb/s 18 Mb/s; 36 Mb/s; 54 Mb/s; 6 Mb/s; 12 Mb/s 24 Mb/s; 48 Mb/s Cell 05 - Address: 00:12:BF:3F:0A:8A ESSID:"CnDStudios" Mode:Managed Frequency:2.412 GHz (Channel 1) Quality:5/5 Signal level:-47 dBm Noise level:-95 dBm IE: WPA Version 1 Group Cipher : TKIP Pairwise Ciphers (1) : TKIP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 22 Mb/s 6 Mb/s; 9 Mb/s; 12 Mb/s; 18 Mb/s; 24 Mb/s 36 Mb/s; 48 Mb/s; 54 Mb/s Cell 06 - Address: 00:1C:A8:6E:84:32 ESSID:"AIR_TIES" Mode:Managed Frequency:2.462 GHz (Channel 11) Quality:5/5 Signal level:-56 dBm Noise level:-98 dBm IE: IEEE 802.11i/WPA2 Version 1 Group Cipher : CCMP Pairwise Ciphers (1) : CCMP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 
Mb/s; 5.5 Mb/s; 11 Mb/s; 22 Mb/s 6 Mb/s; 9 Mb/s; 12 Mb/s; 18 Mb/s; 24 Mb/s 36 Mb/s; 48 Mb/s; 54 Mb/s Cell 07 - Address: 54:E6:FC:B0:3C:E9 ESSID:"tilda_biri_yeni" Mode:Managed Frequency:2.437 GHz (Channel 6) Quality:1/5 Signal level:-85 dBm Noise level:-99 dBm Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 6 Mb/s 12 Mb/s; 24 Mb/s; 36 Mb/s; 9 Mb/s; 18 Mb/s 48 Mb/s; 54 Mb/s Cell 08 - Address: 18:28:61:16:57:C3 ESSID:"obilet" Mode:Managed Frequency:2.437 GHz (Channel 6) Quality:1/5 Signal level:-88 dBm Noise level:-99 dBm IE: IEEE 802.11i/WPA2 Version 1 Group Cipher : TKIP Pairwise Ciphers (2) : CCMP TKIP Authentication Suites (1) : PSK IE: WPA Version 1 Group Cipher : TKIP Pairwise Ciphers (2) : CCMP TKIP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 18 Mb/s 24 Mb/s; 36 Mb/s; 54 Mb/s; 6 Mb/s; 9 Mb/s 12 Mb/s; 48 Mb/s Cell 09 - Address: 00:1A:2A:60:BF:61 ESSID:"PROGEDA" Mode:Managed Frequency:2.462 GHz (Channel 11) Quality:2/5 Signal level:-75 dBm Noise level:-98 dBm IE: WPA Version 1 Group Cipher : TKIP Pairwise Ciphers (1) : TKIP Authentication Suites (1) : PSK Encryption key:on Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 22 Mb/s 6 Mb/s; 9 Mb/s; 12 Mb/s; 18 Mb/s; 24 Mb/s 36 Mb/s; 48 Mb/s; 54 Mb/s eth0 Interface doesn't support scanning.

    Read the article

  • C# Open Source/Free Social Networking SDK

    - by Nix
    I am currently gathering technology requirements for a site that will be social-network based. I don't want to re-invent the wheel, so I am looking for some type of SDK or collection of tools that can provide me with a way of creating/managing a social network. I understand that no framework will fit my exact needs, so I am also looking for a flexible/extensible framework. An example extension point would be allowing the user to provide sub-networks - maybe a global network that could be sub-classified as work and friends. Beyond that, it would also be nice to be able to import contacts from other networking sites (Facebook, LinkedIn, etc.). My current technology suite will more than likely consist of the following: IIS 7.0, WCF Data Services, SQL Server 2005/2008, ASP.NET front end. So my two questions are: 1) Is there a C# open source social network SDK? 2) Are there C# open source social network APIs (Facebook, LinkedIn, etc.)? If there is any more information you may need, please let me know.

    Read the article

  • C# Social Networking SDK

    - by Nix
    I am currently gathering technology requirements for a site that will be social-network based. I don't want to re-invent the wheel, so I am looking for some type of SDK or collection of tools that can provide me with a way of creating/managing a social network. I understand that no framework will fit my exact needs, so I am also looking for a flexible/extensible framework. An example extension point would be allowing the user to provide sub-networks - maybe a global network that could be sub-classified as work and friends. Beyond that, it would also be nice to be able to import contacts from other networking sites (Facebook, LinkedIn, etc.). My current technology suite will consist of the following: IIS 7.0, WCF Data Services, SQL Server 2005/2008, ASP.NET front end. If there is any more information you may need, please let me know.

    Read the article

  • Open Source/Free Social Networking SDK

    - by Nix
    I am currently gathering technology requirements for a site that will be social-network based. I don't want to re-invent the wheel, so I am looking for some type of SDK or collection of tools that can provide me with a way of creating/managing a social network. I understand that no framework will fit my exact needs, so I am also looking for a flexible/extensible framework. An example extension point would be allowing the user to provide sub-networks - maybe a global network that could be sub-classified as work and friends. Beyond that, it would also be nice to be able to import contacts from other networking sites (Facebook, LinkedIn, etc.). My current technology suite will more than likely consist of the following: IIS 7.0, WCF Data Services, SQL Server 2005/2008, ASP.NET front end. So my two questions are: 1) Is there a C# open source social network SDK? 2) Are there C# open source social network APIs (Facebook, LinkedIn, etc.)? If there is any more information you may need, please let me know.

    Read the article

  • WPF MVVM Pattern, ViewModel DataContext question

    - by orangecl4now
    I used this site to create my demo application: http://windowsclient.net/learn/video.aspx?v=314683 The site was very useful in getting me started, and in their example they created a file called EmployeeRepository.cs, which appears to be the source for the data. In their example the data was hard-wired in code, so I'm trying to learn how to get the data from a data source (like a DB). In my specific case, I want to get the data from a Microsoft Access DB (read-only, so I'll only use SELECT commands). using System.Collections.Generic; using Telephone_Directory_2010.Model; namespace Telephone_Directory_2010.DataAccess { public class EmployeeRepository { readonly List<Employee> _employees; public EmployeeRepository() { if (_employees == null) { _employees = new List<Employee>(); } _employees.Add(Employee.CreateEmployee("Student One", "IT201", "Information Technology", "IT4207", "Building1", "Room650")); _employees.Add(Employee.CreateEmployee("Student Two", "IT201", "Information Technology", "IT4207", "Building1", "Room650")); _employees.Add(Employee.CreateEmployee("Student Three", "IT201", "Information Technology", "IT4207", "Building1", "Room650")); } public List<Employee> GetEmployees() { return new List<Employee>(_employees); } } } I found another example where an Access DB is used, but it doesn't comply with MVVM. So I was trying to figure out how to add the DB file to the project, and how to wire it up and bind it to a listbox (I'm not that far yet). Below is my modified file: using System.Collections.Generic; using Telephone_Directory_2010.Model; // integrating new code with working code using Telephone_Directory_2010.telephone2010DataSetTableAdapters; using System.Windows.Data; namespace Telephone_Directory_2010.DataAccess { public class EmployeeRepository { readonly List<Employee> _employees; // start // integrating new code with working code private telephone2010DataSet.telephone2010DataTable employeeTable; private CollectionView dataView; internal CollectionView DataView { get { if (dataView == null) { dataView = (CollectionView) CollectionViewSource.GetDefaultView(this.DataContext); } return dataView; } } public EmployeeRepository() { if (_employees == null) { _employees = new List<Employee>(); } telephone2010TableAdapter employeeTableAdapter = new telephone2010TableAdapter(); employeeTable = employeeTableAdapter.GetData(); this.DataContext = employeeTable; } public List<Employee> GetEmployees() { return new List<Employee>(_employees); } } } I get the following error message when building: Error 1 'Telephone_Directory_2010.DataAccess.EmployeeRepository' does not contain a definition for 'DataContext' and no extension method 'DataContext' accepting a first argument of type 'Telephone_Directory_2010.DataAccess.EmployeeRepository' could be found (are you missing a using directive or an assembly reference?) C:\Projects\VS2010\Telephone Directory 2010\Telephone Directory 2010\DataAccess\EmployeeRepository.cs 23 90 Telephone Directory 2010
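    The error happens because EmployeeRepository is a plain class, not a FrameworkElement, so it has no DataContext property to assign. One way to stay within MVVM is to keep the repository UI-agnostic and copy rows from the typed dataset into the employee list; a minimal sketch follows (the typed row class and column names are assumptions - adjust them to the real telephone2010 schema):

    using System.Collections.Generic;
    using Telephone_Directory_2010.Model;
    using Telephone_Directory_2010.telephone2010DataSetTableAdapters;

    namespace Telephone_Directory_2010.DataAccess
    {
        public class EmployeeRepository
        {
            readonly List<Employee> _employees = new List<Employee>();

            public EmployeeRepository()
            {
                // Load the Access data once; no DataContext is needed here.
                // The view model calls GetEmployees(), and the view binds
                // its ListBox to the view model, not to the repository.
                var adapter = new telephone2010TableAdapter();
                var table = adapter.GetData();

                foreach (telephone2010DataSet.telephone2010Row row in table)
                {
                    // Hypothetical column names -- match them to the real table.
                    _employees.Add(Employee.CreateEmployee(
                        row.FullName, row.EmployeeId, row.Department,
                        row.Phone, row.Building, row.Room));
                }
            }

            public List<Employee> GetEmployees()
            {
                return new List<Employee>(_employees);
            }
        }
    }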

    Read the article

  • Drawbacks of WebFormsMVP?

    - by Tony_Henrich
    While ASP.NET MVC seems to be a viable technology praised by a lot of developers, I can't seem to find enough reasons to devote my energy and time to it. The main reason is that I don't find enough .NET jobs asking for it. Companies still use WebForms, and it works just fine for them. I am not self-employed, where I would get to choose the technology I like. I would rather use my time to improve my skills in Silverlight, jQuery, JavaScript, SQL, LINQ, etc. Even Photoshop! So I got interested in webformsmvp.com: I still get to use WebForms, but with better testing methods. Can anyone who has experience with it tell me what they didn't like about it?

    Read the article

  • best way to authenticate and consume web service using phonegap (html5/javascript)

    - by Raiss
    I am going to develop a PhoneGap application which is pretty simple. I need to implement authentication and some simple data transfer back and forth between the phone and the server. I would prefer to use ASP.NET for the web service, and our database is MS SQL, but I am not sure what approach I should take to create secure communication between the PhoneGap app and the web service. The problem with a simple AJAX request is the cross-domain limitation, and I'm not sure JSONP is a good option. I was wondering if someone can tell me what technology I should use to make a semi-secure connection that works with PhoneGap (HTML5, JavaScript) and a .NET web service. I understand that it's a general question, but I need to know what technology is best in such a case. Thanks.
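    One common pattern, shown here as a minimal sketch under assumptions rather than a definitive design: skip JSONP, have the server emit CORS headers, and authenticate with a token returned at login and replayed on each request. The handler below is hypothetical - the endpoint name and the ValidateUser/CreateToken helpers are placeholders you would back with your MS SQL user table and a signed-token scheme:

    using System.Web;

    // Hypothetical ASP.NET handler (e.g. mapped to /api/login.ashx).
    // Emits CORS headers so a page on another origin (or a browser
    // test harness for the PhoneGap app) can call it with plain XHR.
    public class LoginHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // In production, echo back a specific origin instead of "*".
            context.Response.AddHeader("Access-Control-Allow-Origin", "*");
            context.Response.AddHeader("Access-Control-Allow-Headers", "Content-Type");
            context.Response.ContentType = "application/json";

            string user = context.Request.Form["username"];
            string pass = context.Request.Form["password"];

            if (ValidateUser(user, pass))
            {
                // Client stores the token and sends it with later requests.
                context.Response.Write("{\"token\":\"" + CreateToken(user) + "\"}");
            }
            else
            {
                context.Response.StatusCode = 401;
                context.Response.Write("{\"error\":\"invalid credentials\"}");
            }
        }

        public bool IsReusable { get { return true; } }

        // Placeholders: look the user up in the MS SQL user table and
        // issue a signed, expiring token (e.g. HMAC over user + expiry).
        private bool ValidateUser(string user, string pass) { return false; }
        private string CreateToken(string user) { return ""; }
    }

    From the packaged PhoneGap app, a plain AJAX POST to this endpoint should then work without JSONP, since the server's CORS headers (together with PhoneGap's domain whitelist) permit the cross-origin request; always serve it over HTTPS so the token can't be sniffed.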

    Read the article

  • Microsoft Dynamics CRM -- Do people build websites with it?

    - by Marcy Sutton
    Forgive my ignorance, but do people build websites with Microsoft Dynamics CRM? I have a potential client who says that is the technology they will use for a new web project, for which I would be doing the HTML templating. I want to learn all I can as I am new to this particular system, but I can't seem to find anything related to web building and CRM. Is it more likely the client is developing another piece of technology to work with the CRM that they are neglecting to tell us about? Any experience or insight about this process is greatly appreciated!

    Read the article

  • Websites using JEE

    - by Rob
    Hi guys, I have a simple question, but I can't find the answer. I'm wondering if we can tell that a website is built using JEE technology, or servlets/JSP. I think it could be possible to look for special pages from the server (404, wrong parameters, ...) in some cases, but what about everyday use? In fact, I'm looking for a collection of great (or widely used) websites using Java technology, and I can't really find a list of these. I'll be very happy if you can help me with these two small questions.
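    There is no guaranteed signal, but response headers often give a hint: servlet containers conventionally issue a JSESSIONID cookie, and some send identifying Server or X-Powered-By headers. A minimal sketch of that heuristic follows - treat it as a hint only, since administrators routinely strip or rewrite these headers (the URL below is a placeholder):

    using System;
    using System.Net;

    class JeeSniffer
    {
        static void Main()
        {
            var request = (HttpWebRequest)WebRequest.Create("http://example.com/");
            request.Method = "HEAD"; // some servers reject HEAD; fall back to GET if so

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                // Typical servlet-container fingerprints; absence proves nothing.
                string poweredBy = response.Headers["X-Powered-By"]; // e.g. "Servlet/3.0"
                string server = response.Headers["Server"];          // e.g. "Apache-Coyote/1.1"
                string cookies = response.Headers["Set-Cookie"];     // e.g. "JSESSIONID=..."

                bool looksLikeJee =
                    (poweredBy != null && poweredBy.Contains("Servlet")) ||
                    (server != null && server.Contains("Coyote")) ||
                    (cookies != null && cookies.Contains("JSESSIONID"));

                Console.WriteLine(looksLikeJee
                    ? "Probably JEE/servlets"
                    : "No obvious JEE fingerprint");
            }
        }
    }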

    Read the article

  • Free hosting service for private and public git repositories

    - by Alexander
    Hi, does anyone know a free service for hosting private and public git repositories? There are a lot of services, for example the well-known GitHub, but most of them only allow hosting of public repositories. I want to host one or more of my private programming projects using git, but not all of them should be public (at least not for now). I also found the free service GitFarm, which is built using Google App Engine technology, but I couldn't find any information on how it works (I don't know what "built on Google App Engine technology" means) or whether there are any other limitations. Also, it seems like there is no web front-end available. An integrated web front-end, bug tracker, and stuff like this would also be a big plus!

    Read the article

  • What does N years of experience with a language really mean?

    - by marcgg
    I've been looking at job descriptions since I'm graduating soon and looking for a job, and what always comes back - I'm not telling you anything new - is the "N years of experience in this language" requirement. It has been discussed in this question that if you work professionally with, let's say, Ruby for 2 years, but during those two years you also did some C# and PHP and were actually coding in Ruby only 50% of the time, do you say you have 1 year of experience in Ruby? 2 years? Another issue that wasn't covered in the other post is "non-professional experience". I'll give you a personal example: I've been working with Ruby on Rails since 2004, while at school. I did a lot of personal projects and school projects using this technology. I also used Rails in two 6-month internships. Do I have 5 years of Rails experience (2004-now)? Do I have 1 year (two internships)? Do I have nothing? I feel like I don't deserve credit for 5 years, because in the first years I wasn't working a lot with Rails, but since last year I've launched some websites and invested myself a lot in this technology, and just saying 1 year doesn't really reflect how well I know it... Another example: I learned C++ at school and did one big project with it (2-3 months of work and a semester of classes). I never used it in a company, but I'd be able to be productive fairly quickly if I had to work on a C++ project, and I have a good grasp of the concepts. Do I have no experience? 3 months? 6 months? ... something else? What I'm really trying to do is find a way to present my skill set that is consistent with what recruiters expect. I also don't want to end up in an interview that goes something like this... Recruiter (finding out the horrible truth): Oh, but you said that you had 2 years of experience with this when you have none! / slaps me in the face / Me (in pain): Oh! The irony! Recruiter (yelling): Get out of my office / calls security, punches me in the throat /

    Read the article

< Previous Page | 117 118 119 120 121 122 123 124 125 126 127 128  | Next Page >