Search Results

Search found 41511 results on 1661 pages for 'via point'.

  • Sharing a mounted Truecrypt volume via Samba

    - by user10492
    Been banging my head against the wall on this one for a while. I have an encrypted (via TrueCrypt) partition on a drive. In Windows, I mount it locally and share it on my local network. I'm trying to do the same in Ubuntu 10.10 but am running into permission issues. The TrueCrypt volume has odd permissions and I just can't seem to figure out a way to access it (insecurely, of course) over my network. Other non-TrueCrypt volume shares work just fine. To recap: I mount a TrueCrypt volume in Ubuntu and set up a Samba share as normal. The share shows up on my local network, but accessing it gives 'permission denied'. Mounting it as a network drive using a password does not seem to work either. Any insight would be greatly appreciated.

    Read the article

  • Video/audio output via HDMI on Ubuntu 12.04

    - by lostNfound
    I've been out of the Ubuntu loop for quite a while now and have a completely new laptop. I just installed Ubuntu 12.04 64-bit and would like to output my video and audio via HDMI to my television. The following is the output of lspci | grep VGA on my machine; please tell me if any additional information is needed (and preferably how to obtain it) and I will be more than happy to oblige. Thank you in advance for your time and assistance in this matter.

        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GF108 [GeForce GT 540M] (rev a1)

    Edit: Every time I restart my computer, after a short moment I get an error message stating something along the lines of "Sorry, jockey needed to close unexpectedly." After researching, I discovered that Jockey is the name of the "Additional Drivers" tool, which after the initial installation informed me that proprietary drivers were available. Those are no longer offered, and this error continues to occur.

    Read the article

  • URL rewriting via forward proxy

    - by Biggroover
    I have an app that runs inside my firewall and talks out to multiple endpoints via HTTP/HTTPS on a non-standard port, e.g. http://endpoint1.domain.com:7171, http://endpoint2.domain.com:7171. What I want to do is route these requests through a forward proxy that rewrites the URL to something like http://allendpoints.domain.com/endpoint1 (port 80 or 443), and then, on the other end, have a reverse proxy that unwinds what the forward proxy did in order to reach the specific endpoints. The result would be that I can route existing app requests to specific endpoints across the internet without having to change my app software. My questions are: Is this even possible? Is it a good idea, or are there better ways to do this? Can this be done with IIS and Apache as the proxies?
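
    A minimal sketch of the URL scheme being described, in Python purely for illustration; the actual rewrite and unwind would live in the proxy configuration itself (e.g. Apache mod_proxy/mod_rewrite, or IIS with URL Rewrite/ARR), and the hostnames are the question's own placeholders:

        from urllib.parse import urlparse

        def to_public_url(internal_url):
            """Forward side: http://endpoint1.domain.com:7171/path -> https://allendpoints.domain.com/endpoint1/path"""
            parts = urlparse(internal_url)
            endpoint = parts.hostname.split(".")[0]        # "endpoint1"
            return f"https://allendpoints.domain.com/{endpoint}{parts.path or '/'}"

        def to_internal_url(public_path):
            """Reverse side: /endpoint1/path -> http://endpoint1.domain.com:7171/path"""
            _, endpoint, *rest = public_path.split("/")
            return f"http://{endpoint}.domain.com:7171/" + "/".join(rest)

        print(to_public_url("http://endpoint1.domain.com:7171/api/v1/status"))
        print(to_internal_url("/endpoint1/api/v1/status"))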

    Read the article

  • Ad networks that will serve via HTTPS?

    - by Dogweather
    I've built a website with 160K page views per month that serves every page over HTTPS. The recent Firesheep news will probably increase the adoption of "HTTPS everywhere", but it's been very hard to find ad networks and affiliates that will serve their content via HTTPS. I don't want to use HTTP-only networks because I don't want my visitors to get "broken security" warnings from their browsers (and of course, relevant ads would be a leak of private information). I'm tired of spending a ton of time signing up with ad networks and affiliates only to find out down the road that they don't support HTTPS (e.g. AdSense). Can anyone suggest any options, or provide a pointer to a list of HTTPS-capable networks somewhere?

    Read the article

  • After 10.10 -> 11.04 upgrade, can only login via Classic (No Effects)

    - by Ryan P.
    Yesterday I upgraded from 10.10 to 11.04; everything seemed to go okay until immediately after login, when the desktop goes into a "corrupted"-looking state (similar to having too high a resolution set). I can see some kind of movement by moving the mouse around/right-clicking, and can enter text terminals via Ctrl + Alt + F1. It does this in both plain "Ubuntu" and "Ubuntu Classic", and only seems to log in/start up properly with Ubuntu Classic (No Effects). I have checked my video card (Radeon X600) and run the Unity support test, which passes with all "yes" results (Unity supported: yes):

        /usr/lib/nux/unity_support_test -p

    I have tried re-installing my Ubuntu desktop:

        rm -rf .gnome .gnome2 .gconf .gconfd .metacity
        sudo apt-get remove ubuntu-desktop
        sudo apt-get install ubuntu-desktop

    With no success. I can work around this for now with Classic (No Effects), but I'd really like to find the root problem. Any suggestions on what else to try would be appreciated!

    Read the article

  • Connecting Ubuntu home server to internet via laptop

    - by Gray-Wolfe
    I recently got a spare desktop computer from a relative, and since I've been using Ubuntu for a few years, I decided to install Ubuntu Server 12.04 on it to experiment and learn more. However, it doesn't have a wireless card (the wireless adapter had a shipping error and will be a while), and plugging it into the router is not an option. So I figured I could share the Wi-Fi connection from my laptop (which I switch between Windows and Ubuntu) so I could at least get some things started and set up while I'm waiting for the adapter. However, the few guides I can find for doing that require a GUI, something lacking on the server version. Could someone tell me how to set this up via the terminal? I would appreciate it.

    Read the article

  • Auto-provisioning hosting via API

    - by user101289
    I've built a sort of 'software as a service' website package for a specific industry. What I am looking to do is create a payment gateway that allows users to subscribe; once the subscription is active, it would auto-provision a web hosting plan for them (a shared account on a server, probably in a chroot'd environment so each user would be insulated from the others). Ideally it would auto-install a CMS as well. Tons of web hosts provide a simple reseller plan through which I could manually create all the users' hosting accounts, but so far none that I've found lets you do this via an API. Is there a way to do this short of writing custom shell scripts on something like an EC2 platform? I'd prefer to leave all the server maintenance in the hands of dedicated support staff rather than having to manually handle updates, backups, etc. Thanks for any tips.
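
    A minimal sketch of the intended flow, assuming a purely hypothetical control-panel REST API (the endpoint, fields, and auth shown here are invented for illustration; real panels such as cPanel/WHM or Plesk expose their own account-creation APIs with different shapes):

        import requests

        PANEL_URL = "https://panel.example.com/api/accounts"   # hypothetical endpoint
        API_TOKEN = "..."                                       # kept out of source in practice

        def handle_subscription_activated(event):
            """Called by the payment gateway's webhook once a subscription becomes active."""
            resp = requests.post(
                PANEL_URL,
                headers={"Authorization": f"Bearer {API_TOKEN}"},
                json={
                    "username": event["customer_id"],
                    "domain": event["requested_domain"],
                    "plan": "shared-chroot",        # the isolated shared plan
                    "install_app": "cms",           # auto-install the CMS on creation
                },
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json()                      # e.g. account id / credentials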

    Read the article

  • CSS not loading when site is viewed via Windows VPN

    - by Dreamling
    Our internal site has recently been redesigned, but IE8 does not seem to load the new CSS rules, and only when the site is viewed via the VPN. I really have no clue what to look for. I can't reproduce the problem, but it has apparently been affecting the client for the last month. I've suggested: reloading IE8, checking Internet permissions, and flushing the cache. I'm not really certain what direction to search in for the answer. Is it likely to be a server permissions issue? A VPN connection issue? A rare IE8 CSS bug?

    Read the article

  • Mouse Not Detected & Network Not Connected After Installing Ubuntu 12.04 Desktop via Live USB

    - by albus_severus
    I just recently (about 30 minutes ago) installed Ubuntu 12.04 Desktop on my PC (dual boot) via Live USB (I checked the "automatically install upgrade" option when installing it). Unfortunately, at the login screen after the installation finished (after reboot), I cannot use my mouse! Also, an error message appears saying that no network connection is available. But when I restart again and use the "Try Ubuntu without installing" option on the Live USB, the problems don't occur. I tried Googling this but failed to find any solution. And yes, I am totally green on Ubuntu and Linux, so please help me with this. Thanks in advance.

    Read the article

  • New Window Via JavaScript Clears Parent

    - by Bunch
    This is not a new item at all, but I came across it recently. For an app I had been using some JavaScript like javascript:window.open(someurl.aspx here) to open a new window via a button. That bit of code had been working great in several other apps. Then, in one app, that same code decided to open the new window correctly while clearing the parent of everything but [object]. The fix ended up being simple: change the JavaScript to javascript:void(window.open(someurl.aspx here)); and then it worked like I thought it should. Tags: ASP.Net, JavaScript

    Read the article

  • Can't Make Changes via FTP on my Ubuntu Server

    - by Rev
    I'm very new to server management; I literally just set this up for the first time a couple of weeks ago and have no idea what I'm doing. In order to allow WordPress to run updates, apparently everything needs to be owned by www-data, but if everything is owned by www-data, I can't FTP with the user revxx14. When I try, I can't make any changes (no deletions, additions, or file updates; I get permission denied errors across the board). Is there a way to give my user the same permissions as www-data, so that I can keep www-data as the owner but still be able to make changes via FTP? Thanks.

    Read the article

  • iPhone SDK Point to a specific location

    - by lnetanel
    Hi, I'm trying to develop an application that uses the GPS and compass of the iPhone to point some sort of pointer at a specific location (much as a compass always points north). The location is fixed, and I always need the pointer to point to that specific location no matter where the user is located. I have the lat/long coordinates of this location, but I'm not sure how I can point to it using the compass and the GPS. Any help will be appreciated. Netanel
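
    A minimal sketch of the underlying math, in Python purely for illustration: compute the initial great-circle bearing from the user's GPS position to the fixed target, then subtract the compass heading to get the angle for the on-screen pointer.

        import math

        def bearing_to(lat1, lon1, lat2, lon2):
            """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees from north."""
            phi1, phi2 = math.radians(lat1), math.radians(lat2)
            dlon = math.radians(lon2 - lon1)
            y = math.sin(dlon) * math.cos(phi2)
            x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
            return math.degrees(math.atan2(y, x)) % 360.0

        def arrow_angle(user_lat, user_lon, heading_deg, target_lat, target_lon):
            """Angle to rotate the pointer, relative to the top of the device."""
            return (bearing_to(user_lat, user_lon, target_lat, target_lon) - heading_deg) % 360.0

        # Example with arbitrary coordinates and a 45-degree compass heading.
        print(arrow_angle(37.33, -122.03, 45.0, 37.42, -122.08))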

    Read the article

  • Sliding Response after a Point-Square Collision

    - by mars
    In general terms and pseudo-code, what would be the best way to implement a collision response of sliding along a wall if the wall is actually just one side of an entire square that a point is colliding into? The collision test method used is a test to see whether the point lies in the square. Should I divide the square into four edges, calculate the shortest distance from the point to each edge, and then move the point back by that distance? If so, how can I determine which edge of the square the point is closest to after the collision?
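
    One minimal sketch of that idea, assuming an axis-aligned square and a point already detected inside it: the edge with the smallest penetration depth is the closest one, so push the point out through that edge and keep only the tangential velocity component (Python used for illustration).

        def resolve_point_vs_square(px, py, vx, vy, min_x, min_y, max_x, max_y):
            # Penetration depth toward each edge (all positive while the point is inside).
            pen = {
                "left":   px - min_x,
                "right":  max_x - px,
                "bottom": py - min_y,
                "top":    max_y - py,
            }
            edge = min(pen, key=pen.get)          # closest edge after the collision

            if edge == "left":
                px, vx = min_x, 0.0               # push out horizontally, drop the normal velocity
            elif edge == "right":
                px, vx = max_x, 0.0
            elif edge == "bottom":
                py, vy = min_y, 0.0
            else:                                 # "top"
                py, vy = max_y, 0.0
            return px, py, vx, vy                 # tangential velocity is preserved (the "slide")

        print(resolve_point_vs_square(4.9, 2.0, 1.0, -0.5, 0.0, 0.0, 5.0, 5.0))
        # -> (5.0, 2.0, 0.0, -0.5): pushed out through the nearest (right) edge, still sliding along y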

    Read the article

  • Align 2 sets of 2D point clouds

    - by user108088
    From what I gather, there are two major methods for performing alignment on point clouds: Iterative Closest Point (ICP) and particle filtering. What are the advantages of each method? And can someone point me to some good tutorials? For what I am currently doing, I think ICP would be easier, but I can't seem to find any simple reference implementations online for 2D point sets. Has anyone seen (pseudo)code for ICP with details on the transformation step? Thanks in advance.
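
    A minimal 2D ICP sketch for illustration (brute-force nearest-neighbour correspondences, then a closed-form rotation + translation via SVD, iterated); it assumes NumPy and is not tuned for large clouds:

        import numpy as np

        def best_rigid_transform(src, dst):
            """Least-squares R, t mapping paired src rows onto dst rows (Kabsch)."""
            src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
            H = (src - src_c).T @ (dst - dst_c)          # 2x2 cross-covariance
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:                     # guard against a reflection
                Vt[-1, :] *= -1
                R = Vt.T @ U.T
            t = dst_c - R @ src_c
            return R, t

        def icp(src, dst, iterations=30):
            cur = src.copy()
            for _ in range(iterations):
                # Nearest destination point for each current source point (brute force).
                d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
                nn = d2.argmin(axis=1)
                R, t = best_rigid_transform(cur, dst[nn])
                cur = cur @ R.T + t                      # apply the incremental transform
            return cur

        # Toy example: dst is src rotated by 10 degrees and shifted slightly.
        rng = np.random.default_rng(0)
        src = rng.random((50, 2))
        theta = np.radians(10)
        R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
        dst = src @ R_true.T + np.array([0.2, -0.1])
        aligned = icp(src, dst)
        print(np.abs(aligned - dst).max())               # small residual once the alignment converges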

    Read the article

  • Finding whether a point lies inside a rectangle or not

    - by avd
    The rectangle can be oriented in any way; it need not be axis-aligned. Now I want to find whether a point lies inside the rectangle or not. One method I could think of was to rotate the rectangle and the point's coordinates to make the rectangle axis-aligned, and then simply test whether the point's coordinates lie within the rectangle's bounds. That method requires a rotation and hence floating-point operations. Is there any other efficient way to do this?
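
    A minimal sketch of an alternative that avoids the rotation entirely: with the corners given in a consistent winding order, the point is inside exactly when it lies on the same side of all four edges, which 2D cross products tell us (Python used for illustration).

        def cross(o, a, b):
            """z-component of (a - o) x (b - o)."""
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

        def point_in_rectangle(p, a, b, c, d):
            # Corners a, b, c, d must be in order (clockwise or counter-clockwise).
            sides = [cross(a, b, p), cross(b, c, p), cross(c, d, p), cross(d, a, p)]
            return all(s >= 0 for s in sides) or all(s <= 0 for s in sides)

        # A square rotated 45 degrees about the origin:
        a, b, c, d = (1, 0), (0, 1), (-1, 0), (0, -1)
        print(point_in_rectangle((0.2, 0.2), a, b, c, d))   # True
        print(point_in_rectangle((1.0, 1.0), a, b, c, d))   # False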

    Read the article

  • Augmenting your Social Efforts via Data as a Service (DaaS)

    - by Mike Stiles
    The following is the 3rd in a series of posts on the value of leveraging social data across your enterprise by Oracle VP Product Development Don Springer and Oracle Cloud Data and Insight Service Sr. Director Product Management Niraj Deo. In this post, we will discuss the approach and value of integrating additional “public” data via a cloud-based Data-as-as-Service platform (or DaaS) to augment your Socially Enabled Big Data Analytics and CX Management. Let’s assume you have a functional Social-CRM platform in place. You are now successfully and continuously listening and learning from your customers and key constituents in Social Media, you are identifying relevant posts and following up with direct engagement where warranted (both 1:1, 1:community, 1:all), and you are starting to integrate signals for communication into your appropriate Customer Experience (CX) Management systems as well as insights for analysis in your business intelligence application. What is the next step? Augmenting Social Data with other Public Data for More Advanced Analytics When we say advanced analytics, we are talking about understanding causality and correlation from a wide variety, volume and velocity of data to Key Performance Indicators (KPI) to achieve and optimize business value. And in some cases, to predict future performance to make appropriate course corrections and change the outcome to your advantage while you can. The data to acquire, process and analyze this is very nuanced: It can vary across structured, semi-structured, and unstructured data It can span across content, profile, and communities of profiles data It is increasingly public, curated and user generated The key is not just getting the data, but making it value-added data and using it to help discover the insights to connect to and improve your KPIs. As we spend time working with our larger customers on advanced analytics, we have seen a need arise for more business applications to have the ability to ingest and use “quality” curated, social, transactional reference data and corresponding insights. The challenge for the enterprise has been getting this data inline into an easily accessible system and providing the contextual integration of the underlying data enriched with insights to be exported into the enterprise’s business applications. The following diagram shows the requirements for this next generation data and insights service or (DaaS): Some quick points on these requirements: Public Data, which in this context is about Common Business Entities, such as - Customers, Suppliers, Partners, Competitors (all are organizations) Contacts, Consumers, Employees (all are people) Products, Brands This data can be broadly categorized incrementally as - Base Utility data (address, industry classification) Public Master Reference data (trade style, hierarchy) Social/Web data (News, Feeds, Graph) Transactional Data generated by enterprise process, workflows etc. This Data has traits of high-volume, variety, velocity etc., and the technology needed to efficiently integrate this data for your needs includes - Change management of Public Reference Data across all categories Applied Big Data to extract statics as well as real-time insights Knowledge Diagnostics and Data Mining As you consider how to deploy this solution, many of our customers will be using an online “cloud” service that provides quality data and insights uniformly to all their necessary applications. 
In addition, they are requesting a service that is: Agile and Easy to Use: Applications integrated with the service can obtain data on-demand, quickly and simply Cost-effective: Pre-integrated into applications so customers don’t have to Has High Data Quality: Single point access to reference data for data quality and linkages to transactional, curated and social data Supports Data Governance: Becomes more manageable and cost-effective since control of data privacy and compliance can be enforced in a centralized place Data-as-a-Service (DaaS) Just as the cloud has transformed and now offers a better path for how an enterprise manages its IT from their infrastructure, platform, and software (IaaS, PaaS, and SaaS), the next step is data (DaaS). Over the last 3 years, we have seen the market begin to offer a cloud-based data service and gain initial traction. On one side of the DaaS continuum, we see an “appliance” type of service that provides a single, reliable source of accurate business data plus social information about accounts, leads, contacts, etc. On the other side of the continuum we see more of an online market “exchange” approach where ISVs and Data Publishers can publish and sell premium datasets within the exchange, with the exchange providing a rich set of web interfaces to improve the ease of data integration. Why the difference? It depends on the provider’s philosophy on how fast the rate of commoditization of certain data types will occur. How do you decide the best approach? Our perspective, as shown in the diagram below, is that the enterprise should develop an elastic schema to support multi-domain applicability. This allows the enterprise to take the most flexible approach to harness the speed and breadth of public data to achieve value. The key tenet of the proposed approach is that an enterprise carefully federates common utility, master reference data end points, mobility considerations and content processing, so that they are pervasively available. One way you may already be familiar with this approach is in how you do Address Verification treatments for accounts, contacts etc. If you design and revise this service in such a way that it is also easily available to social analytic needs, you could extend this to launch geo-location based social use cases (marketing, sales etc.). Our fundamental belief is that value-added data achieved through enrichment with specialized algorithms, as well as applying business “know-how” to weight-factor KPIs based on innovative combinations across an ever-increasing variety, volume and velocity of data, will be where real value is achieved. Essentially, Data-as-a-Service becomes a single entry point for the ever-increasing richness and volume of public data, with enrichment and combined capabilities to extract and integrate the right data from the right sources with the right factoring at the right time for faster decision-making and action within your core business applications. As more data becomes available (and in many cases commoditized), this value-added data processing approach will provide you with ongoing competitive advantage. Let’s look at a quick example of creating a master reference relationship that could be used as an input for a variety of your already existing business applications. In phase 1, a simple master relationship is achieved between a company (e.g. General Motors) and a variety of car brands’ social insights. 
The reference data allows for easy sorting, export and integration into a set of CRM use cases for analytics, sales and marketing. In phase 2, you create more data relationships (e.g. competitors, contacts, other brands) to build broader and deeper references (social profiles, social metadata) for more use cases across CRM, HCM, SRM, etc. This is just the tip of the iceberg, as the number of master reference relationships is constrained only by your imagination and the availability of quality curated data you have to work with. DaaS is just now emerging onto the marketplace as the next step in cloud transformation. For some of you, this may be the first you have heard about it. Let us know if you have questions or perspectives. In the meantime, we will continue to share insights as we can. Photo: Erik Araujo, stock.xchng

    Read the article

  • Rendering Flickr Cats Via Backbone.js

    - by Geertjan
    Create a JavaScript file and refer to it inside an HTML file. Then put this into the JavaScript file:

        (function($) {
            // Collection backed by the public Flickr feed of cat photos (JSONP).
            var CatCollection = Backbone.Collection.extend({
                url: 'http://api.flickr.com/services/feeds/photos_public.gne?tags=cat&tagmode=any&format=json&jsoncallback=?',
                parse: function(response) {
                    return response.items;
                }
            });
            var CatView = Backbone.View.extend({
                el: $('body'),
                initialize: function() {
                    _.bindAll(this, 'render');
                    // Fetch the feed, then render once the data has arrived.
                    carCollectionInstance.fetch({
                        success: function(response, xhr) {
                            catView.render();
                        }
                    });
                },
                render: function() {
                    $(this.el).append("<ul></ul>");
                    for (var i = 0; i < carCollectionInstance.length; i++) {
                        $('ul', this.el).append("<li>" + i + carCollectionInstance.models[i].get("description") + "</li>");
                    }
                }
            });
            var carCollectionInstance = new CatCollection();
            var catView = new CatView();
        })(jQuery);

    Apologies for any errors or misused idioms. It's my second day with Backbone.js; in fact, my second day with JavaScript. I haven't seen a full example like the one above anywhere online so far, only pieces of it, or explanations in text without an actual working example. The next step, and the only reason for the above experiment, is to create some JPA entities and expose them via RESTful web services created on EJB methods, for consumption into an HTML5 application via a Backbone.js script very similar to the above.

    Read the article

  • Printing to PowerPoint

    - by manojpcw
    Hi, Similar to the print-to-PDF option, where we can choose PDF as the output format in the print dialog box when printing something from a browser or other applications, I am searching for something that can print to a PowerPoint file. Is there any such plugin or tool? Also, a link to a reliable print-to-PDF tool would be helpful. This would essentially eliminate the export-to-PowerPoint option that users are asking for in my Silverlight application. Thanks...

    Read the article

  • Parallelism in .NET – Part 11, Divide and Conquer via Parallel.Invoke

    - by Reed
    Many algorithms are easily written to work via recursion. For example, most data-oriented tasks where a tree of data must be processed are much more easily handled by starting at the root, and recursively "walking" the tree. Some algorithms work this way on flat data structures, such as arrays, as well. This is a form of divide and conquer: an algorithm design which is based around breaking up a set of work recursively, "dividing" the total work in each recursive step, and "conquering" the work when the remaining work is small enough to be solved easily. Recursive algorithms, especially ones based on a form of divide and conquer, are often very good candidates for parallelization. This is apparent from a common sense standpoint. Since we're dividing up the total work in the algorithm, we have an obvious, built-in partitioning scheme. Once partitioned, the data can be worked upon independently, so there is good, clean isolation of data. Implementing this type of algorithm is fairly simple. The Parallel class in .NET 4 includes a method suited for this type of operation: Parallel.Invoke. This method works by taking any number of delegates defined as an Action, and operating them all in parallel. The method returns when every delegate has completed:

        Parallel.Invoke(
            () => { Console.WriteLine("Action 1 executing in thread {0}", Thread.CurrentThread.ManagedThreadId); },
            () => { Console.WriteLine("Action 2 executing in thread {0}", Thread.CurrentThread.ManagedThreadId); },
            () => { Console.WriteLine("Action 3 executing in thread {0}", Thread.CurrentThread.ManagedThreadId); }
        );

    Running this simple example demonstrates the ease of using this method. For example, on my system, I get three separate thread IDs when running the above code. By allowing any number of delegates to be executed directly, concurrently, the Parallel.Invoke method provides us an easy way to parallelize any algorithm based on divide and conquer. We can divide our work in each step, and execute each task in parallel, recursively. For example, suppose we wanted to implement our own quicksort routine. The quicksort algorithm can be designed based on divide and conquer. In each iteration, we pick a pivot point, and use that to partition the total array. We swap the elements around the pivot, then recursively sort the lists on each side of the pivot.
    For example, let's look at this simple, sequential implementation of quicksort:

        public static void QuickSort<T>(T[] array) where T : IComparable<T>
        {
            QuickSortInternal(array, 0, array.Length - 1);
        }

        private static void QuickSortInternal<T>(T[] array, int left, int right) where T : IComparable<T>
        {
            if (left >= right)
            {
                return;
            }

            SwapElements(array, left, (left + right) / 2);

            int last = left;
            for (int current = left + 1; current <= right; ++current)
            {
                if (array[current].CompareTo(array[left]) < 0)
                {
                    ++last;
                    SwapElements(array, last, current);
                }
            }

            SwapElements(array, left, last);

            QuickSortInternal(array, left, last - 1);
            QuickSortInternal(array, last + 1, right);
        }

        static void SwapElements<T>(T[] array, int i, int j)
        {
            T temp = array[i];
            array[i] = array[j];
            array[j] = temp;
        }

    Here, we implement the quicksort algorithm in a very common, divide and conquer approach. Running this against the built-in Array.Sort routine shows that we get the exact same answers (although the framework's sort routine is slightly faster). On my system, for example, I can use the framework's sort to sort ten million random doubles in about 7.3s, and this implementation takes about 9.3s on average. Looking at this routine, though, there is a clear opportunity to parallelize. At the end of QuickSortInternal, we recursively call into QuickSortInternal with each partition of the array after the pivot is chosen. This can be rewritten to use Parallel.Invoke by simply changing it to:

        // Code above is unchanged...
        SwapElements(array, left, last);

        Parallel.Invoke(
            () => QuickSortInternal(array, left, last - 1),
            () => QuickSortInternal(array, last + 1, right)
        );
        }

    This routine will now run in parallel. When executing, we now see the CPU usage across all cores spike while it executes. However, there is a significant problem here – by parallelizing this routine, we took it from an execution time of 9.3s to an execution time of approximately 14 seconds! We're using more resources as seen in the CPU usage, but the overall result is a dramatic slowdown in overall processing time. This occurs because parallelization adds overhead. Each time we split this array, we spawn two new tasks to parallelize this algorithm! This is far, far too many tasks for our cores to operate upon at a single time. In effect, we're "over-parallelizing" this routine. This is a common problem when working with divide and conquer algorithms, and leads to an important observation: when parallelizing a recursive routine, take special care not to add more tasks than necessary to fully utilize your system. This can be done with a few different approaches, in this case. Typically, the way to handle this is to stop parallelizing the routine at a certain point, and revert back to the serial approach. Since the first few recursions will all still be parallelized, our "deeper" recursive tasks will be running in parallel, and can take full advantage of the machine. This also dramatically reduces the overhead added by parallelizing, since we're only adding overhead for the first few recursive calls. There are two basic approaches we can take here. The first approach would be to look at the total work size, and if it's smaller than a specific threshold, revert to our serial implementation. In this case, we could just check right - left, and if it's under a threshold, call the methods directly instead of using Parallel.Invoke.
    The second approach is to track how "deep" in the "tree" we currently are, and if we are below some number of levels, stop parallelizing. This approach is a more general-purpose approach, since it works on routines which parse trees as well as routines working off of a single array, but may not work as well if a poor partitioning strategy is chosen or the tree is not balanced evenly. This can be written very easily. If we pass a maxDepth parameter into our internal routine, we can restrict the number of times we parallelize by changing the recursive call to:

        // Code above is unchanged...
        SwapElements(array, left, last);

        if (maxDepth < 1)
        {
            QuickSortInternal(array, left, last - 1, maxDepth);
            QuickSortInternal(array, last + 1, right, maxDepth);
        }
        else
        {
            --maxDepth;
            Parallel.Invoke(
                () => QuickSortInternal(array, left, last - 1, maxDepth),
                () => QuickSortInternal(array, last + 1, right, maxDepth));
        }

    We no longer allow this to parallelize indefinitely – only to a specific depth, at which time we revert to a serial implementation. By starting the routine with a maxDepth equal to Environment.ProcessorCount, we can restrict the total amount of parallel operations significantly, but still provide adequate work for each processing core. With this final change, my timings are much better. On average, I get the following timings:

        Framework via Array.Sort: 7.3 seconds
        Serial Quicksort Implementation: 9.3 seconds
        Naive Parallel Implementation: 14 seconds
        Parallel Implementation Restricting Depth: 4.7 seconds

    Finally, we are now faster than the framework's Array.Sort implementation.

    Read the article

  • 550 “Overwrite permission denied” when editing a file via FTP

    - by nodebunny
    DreamHost recently moved my accounts to a new shared box, and now I can't edit files via UltraEdit's built-in FTP client, which messes up my workflow! What did they change that this no longer works? It stopped working after they moved me. Here's the output from the FTP console in UltraEdit:

        10/26/2011 10:42:36 AM: 220 DreamHost FTP Server
        10/26/2011 10:42:36 AM: USER nodebunny
        10/26/2011 10:42:36 AM: 331 Password required for ninjawww
        10/26/2011 10:42:36 AM: PASS xxxxxxxx
        10/26/2011 10:42:36 AM: 230 User nodebunny logged in
        10/26/2011 10:42:36 AM: FEAT
        10/26/2011 10:42:36 AM: 211-Features: LANG ja-JP.UTF-8;ja-JP;zh-TW;fr-FR;zh-CN;en-US*;bg-BG;ko-KR.UTF-8;ko-KR MDTM MFMT TVFS UTF8 MFF modify;UNIX.group;UNIX.mode; MLST modify*;perm*;size*;type*;unique*;UNIX.group*;UNIX.mode*;UNIX.owner*; REST STREAM SIZE 211 End
        10/26/2011 10:42:36 AM: OPTS UTF8 ON
        10/26/2011 10:42:36 AM: 200 UTF8 set to on
        10/26/2011 10:42:36 AM: PWD
        10/26/2011 10:42:36 AM: 257 "/" is the current directory
        10/26/2011 10:42:36 AM: PWD
        10/26/2011 10:42:36 AM: 257 "/" is the current directory
        10/26/2011 10:42:36 AM: CWD /dev/proj/nodebunny
        10/26/2011 10:42:36 AM: 250 CWD command successful
        10/26/2011 10:42:36 AM: PWD
        10/26/2011 10:42:36 AM: 257 "/dev/proj/nodebunny/lib/Buffer" is the current directory
        10/26/2011 10:42:36 AM: PWD
        10/26/2011 10:42:37 AM: 257 "/dev/proj/nodebunny/lib/Buffer" is the current directory
        10/26/2011 10:42:37 AM: TYPE I
        10/26/2011 10:42:37 AM: 200 Type set to I
        10/26/2011 10:42:37 AM: PORT 10,15,55,125,226,16
        10/26/2011 10:42:37 AM: 200 PORT command successful
        10/26/2011 10:42:37 AM: STOR Buffer.pm
        10/26/2011 10:42:37 AM: 550 Buffer.pm: Overwrite permission denied

    Read the article

  • Loss of iPod nano connection via VirtualBox and iTunes

    - by user69245
    I use VirtualBox 4.12 with Ubuntu 12.04, and up until a couple of days ago I was able to sync my iPod nano, my iPod Classic and my iPad via Windows XP and iTunes. A few days ago, when I connected the iPod nano, iTunes told me it was in recovery and I'd have to Restore it. I did this, but when it started up again I was told it was still in recovery. I went through this loop a few times, and then I got it to work again, and synched successfully. Since then, I have been unable to sync my iPod nano, although both iPod Classic and iPad work fine. I have successfully synched the iPod on a laptop with Windows XP since then, but each time I connect it through VirtualBox I get the same error message. I have tried disabling the iPod service in (the VirtualBox version of) Windows, and find that the iPod isn't being mounted as a disc drive, which is what happens in Ubuntu, and on my Windows XP laptop. I've tried changing USB leads, and I've also booted Windows XP on my Ubuntu machine, and here iTunes recognises the iPod nano. As far as I know, I made no changes to the configuration of VirtualBox between the time it was synching satisfactorily and the time it failed. I don't think iTunes is at fault here, as the same version of iTunes syncs with the iPod nano on two different computers running Windows XP - I think it may be the way XP running in VirtualBox handles the mounting of the iPod. Any suggestions?

    Read the article

  • C# via Java: Introduction

    - by simonc
    Originally posted on: http://geekswithblogs.net/simonc/archive/2013/11/08/c-via-java-introduction.aspx

    So, I've recently changed jobs. Rather than working in .NET land, I've migrated over to Java land. But never fear! I'll continue to peer under the covers of .NET, but my next series will use my new experience in Java to explore the design decisions made in the development of the C# programming language. After all, the design of C# was based on Java 1.2, and both languages have continued to evolve since then, incorporating modern software engineering concepts and requirements. Exploring the differences and similarities between the two will (hopefully) give us a deeper understanding of why .NET is implemented the way it is, the trade-offs involved, and what choices were made when new features were designed and added to the language and framework. Among others, I'll be looking at differences in: primitives, operators, generics, exceptions, accessibility, collections, delegates and inner classes, and concurrency. In my next post, I'll start off by looking at the type primitives available in each language, and how Java and C# actually incorporate two different concepts of primitive types in their fundamental language design and use. I'm also thinking of looking at the inner details of Java and the JVM in my blogs, as well as C# and the CLR. If you've got any comments or thoughts on this, please let me know.

    Read the article
