Search Results

Search found 22762 results on 911 pages for 'wcf client'.


  • Rebuilding CoasterBuzz, Part II: Hot data objects

    - by Jeff
    This is the second post, originally from my personal blog, in a series about rebuilding one of my Web sites, which has been around for 12 years. More: Part I: Evolution, and death to WCF

    After the rush to get moving on stuff, I temporarily lost interest. I went almost two weeks without touching the project, in part because the next thing on my backlog was doing up a bunch of administrative pages. So boring. Unfortunately, because most of the site's content is user-generated, you need some facilities for editing data. CoasterBuzz has a database full of amusement parks and roller coasters. The entities enjoy the relationships that you would expect, though they're further defined by "instances" of a coaster, to define one that has moved between parks as one, with different names and operational dates. And of course, there are pictures and news items, too. It's not horribly complex, except when you have to account for a name change and display just the newest name.

    In all previous versions, data access was straight SQL. As so much of the old code was rooted in 2003, with some changes in 2008, there wasn't much in the way of ORM frameworks going on then. Let me rephrase that: I mostly wasn't interested in ORMs. Since that time, I used a little LINQ to SQL in some projects, and a whole bunch of NHibernate while at Microsoft. Through all of that experience, I have to admit that these frameworks are often a bigger pain in the ass than not. They're great for basic CRUD operations, but when you start having all kinds of exotic relationships, they get difficult, and generate all kinds of weird SQL under the covers. The black box can quickly turn into a black hole. Sometimes you end up having to build all kinds of new expertise to do things "right" with a framework.

    Still, despite my reservations, I used the newer version of Entity Framework, with the "code first" modeling, in a science project and I really liked it. Since it's just a right-click away with NuGet, I figured I'd give it a shot here. My initial effort was spent defining the context class, which requires a bit of work because I deviate quite a bit from the conventions that EF uses, starting with table names. Then throw in some partial querying of certain tables (where you'll find image data), and you're splitting tables across several objects (navigation properties). I won't go into the details, because these are all things that are well documented around the Internet, but there was a minor learning curve there. (A rough sketch of this kind of mapping follows the excerpt.)

    The basics of reading data using EF are fantastic. For example, a roller coaster object has a park associated with it, as well as a number of instances (if it was ever relocated), and there also might be a big banner image for it. This is stupid easy to use because it takes one line of code in your repository class, and by the time you pass it to the view, you have a rich object graph that has everything you need to display stuff. Likewise, editing simple data is also, well, simple. For this goodness, thank the ASP.NET MVC framework. The UpdateModel() method on the controllers is very elegant. Remember the old days of assigning all kinds of properties to objects in your Webforms code-behind? What a time-consuming mess that used to be. Even if you're not using an ORM tool, having hydrated objects come off the wire is such a time saver. Not everything is easy, though.

    When you have to persist a complex graph of objects, particularly if they were composed in the user interface with all kinds of AJAX elements and list boxes, it's not just a simple matter of submitting the form. There were a few instances where I ended up going back to "old-fashioned" SQL just in the interest of time. It's not that I couldn't do what I needed with EF, it's just that the efficiency, both my own and that of the generated SQL, wasn't good. Since EF context objects expose a database connection object, you can use that to do the old-school ADO.NET stuff you've done for a decade. Using various extension methods from POP Forums' data project, it was a breeze. You just have to stick to your decision, in this case: when you start messing with SQL directly, you can't go back in the same code to messing with entities, because EF doesn't know what you're changing. Not really a big deal.

    There are a number of take-aways from using EF. The first is that you write a lot less code, which has always been a desired outcome of ORMs. The other lesson, and I particularly learned this the hard way working on the MSDN forums back in the day, is that trying to retrofit an ORM framework into an existing schema isn't fun at all. The CoasterBuzz database isn't bad, but there are design decisions I'd make differently if I were starting from scratch. Now that I have some of this stuff done, I feel like I can start to move on to the more interesting things on the backlog. There's a lot to do, but at least it's fun stuff, and not more forms that will be used infrequently.
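    For readers who haven't used EF code-first, the two things the post leans on, mapping an existing schema that doesn't follow EF's conventions and dropping to plain ADO.NET through the context's connection, look roughly like the sketch below. This is a minimal illustration under assumptions: the Park/Coaster classes, the tbl_* table names, and the GetCoasterNames helper are invented here, not taken from CoasterBuzz's actual schema.

    ```csharp
    using System.Collections.Generic;
    using System.Data.Entity;   // EntityFramework NuGet package (code-first, EF 4.1+)

    public class Park
    {
        public int ParkId { get; set; }
        public string Name { get; set; }
    }

    public class Coaster
    {
        public int CoasterId { get; set; }
        public string Name { get; set; }
        public int ParkId { get; set; }
        public virtual Park Park { get; set; }   // navigation property: coaster -> park
    }

    public class CoasterBuzzContext : DbContext
    {
        public DbSet<Coaster> Coasters { get; set; }
        public DbSet<Park> Parks { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // The existing schema doesn't follow EF's naming conventions,
            // so map each entity to its legacy table name explicitly.
            modelBuilder.Entity<Coaster>().ToTable("tbl_Coasters");
            modelBuilder.Entity<Park>().ToTable("tbl_Parks");
        }

        // Escape hatch: the context exposes its underlying connection,
        // so old-school ADO.NET is still available when generated SQL isn't good enough.
        public IList<string> GetCoasterNames()
        {
            var names = new List<string>();
            var connection = Database.Connection;
            if (connection.State != System.Data.ConnectionState.Open)
                connection.Open();

            using (var command = connection.CreateCommand())
            {
                command.CommandText = "SELECT Name FROM tbl_Coasters";
                using (var reader = command.ExecuteReader())
                    while (reader.Read())
                        names.Add(reader.GetString(0));
            }
            return names;
        }
    }
    ```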

    Read the article

  • Easter eggs as IP protection in software

    - by Simon
    I work in embedded software, and for some reason, management wants to hide an Easter egg as a means of IP protection. They call it a watermark, and since our software interacts with the video preview feed (the image displayed on a screen before you take a photo), they want me to implement a trigger which will react to some unusual video input (a video Konami code like dark - bright - dark - bright - whatever). When this trigger fires, something strange happens (which is outside of the normal behavior of the software). The goal is to check whether our software is included in a device. Does it sound like a good idea? I have many arguments against this move:
    - What if the Konami code is too sensitive and a user triggers it by accident?
    - Does this kind of watermark have any legal value?
    - What if this "feature" is discovered by the client?
    - The performance penalty should be very small, since the software runs on small devices.
    - I am the one developing this trigger. If things go wrong, what is my responsibility?
    What is your opinion about this method? I can't find a link, but I remember seeing an answer on this site suggesting that putting in Easter eggs for protection purposes was a good idea. Has anyone tried it with good results?
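    For context, the kind of trigger being described usually amounts to a small state machine fed a per-frame brightness estimate. The sketch below is only an illustration of that idea (written in C# for consistency with the rest of this page, although an embedded build would more likely be C); the thresholds, pattern, and class name are all invented, not taken from the question.

    ```csharp
    // Sketch of a "video Konami code" trigger: advance through an expected
    // dark/bright pattern as frames arrive, reset on a mismatch.
    public enum Luma { Dark, Bright }

    public sealed class WatermarkTrigger
    {
        // Expected pattern, e.g. dark - bright - dark - bright.
        private static readonly Luma[] Pattern =
            { Luma.Dark, Luma.Bright, Luma.Dark, Luma.Bright };

        // Invented thresholds on mean luminance normalized to 0..1; tightening
        // them is what keeps the trigger from firing by accident.
        private const double DarkThreshold = 0.15;
        private const double BrightThreshold = 0.85;

        private int _position;

        // Feed one frame's mean luminance; returns true when the full pattern has matched.
        public bool OnFrame(double meanLuminance)
        {
            Luma? observed =
                meanLuminance < DarkThreshold ? Luma.Dark :
                meanLuminance > BrightThreshold ? (Luma?)Luma.Bright :
                null;   // mid-range frames neither advance nor reset the match

            if (observed == null)
                return false;

            if (observed.Value == Pattern[_position])
                _position++;
            else
                _position = observed.Value == Pattern[0] ? 1 : 0;   // allow a restart mid-stream

            if (_position == Pattern.Length)
            {
                _position = 0;
                return true;   // the "watermark" behavior would fire here
            }
            return false;
        }
    }
    ```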

    Read the article

  • Windows Server 2008: Terminal Services / VDI

    - by JohnyD
    I have a Dell R710 with 72GB of memory running Hyper-V. Within Hyper-V I have a Windows 2008 (32-bit) VM running Terminal Services. How do I allocate memory so that any user who connects to this Terminal Server (from their thin client) is allocated 2GB (or whatever amount I choose) of memory? Currently I have provisioned the TS with 2GB of memory, but it seems that this is shared among all who connect. Please let me know if there is further information I can provide. Thank you. Update 1: What I'm looking to accomplish with this server is setting up a VDI to allow users to connect from thin clients within our network. They will also have to connect from outside our network via VPN, which is already in place. Am I able to set this up using Windows Server 2008 (not R2)? I ask because I have a 16-bit application which needs to be supported, and unfortunately it's not a candidate as a RemoteApp.

    Read the article

  • StreamInsight 2.1, meet LINQ

    - by Roman Schindlauer
    Someone recently called LINQ “magic” in my hearing. I leapt to LINQ’s defense immediately. Turns out some people don’t realize “magic” can be a pejorative term. I thought LINQ needed demystification. Here’s your best demystification resource: http://blogs.msdn.com/b/mattwar/archive/2008/11/18/linq-links.aspx. I won’t repeat much of what Matt Warren says in his excellent series, but will talk about some core ideas and how they affect the 2.1 release of StreamInsight. Let’s tell the story of a LINQ query.

    Compile time

    It begins with some code:

        IQueryable<Product> products = ...;
        var query = from p in products
                    where p.Name == "Widget"
                    select p.ProductID;
        foreach (int id in query) { ...

    When the code is compiled, the C# compiler (among other things) de-sugars the query expression (see C# spec section 7.16):

        var query = products.Where(p => p.Name == "Widget").Select(p => p.ProductID);

    Overload resolution subsequently binds the Queryable.Where<Product> and Queryable.Select<Product, int> extension methods (see C# spec sections 7.5 and 7.6.5). After overload resolution, the compiler knows something interesting about the anonymous functions (lambda syntax) in the de-sugared code: they must be converted to expression trees, i.e., “an object structure that represents the structure of the anonymous function itself” (see C# spec section 6.5). The conversion is equivalent to the following rewrite:

        var prm1 = Expression.Parameter(typeof(Product), "p");
        var prm2 = Expression.Parameter(typeof(Product), "p");
        var query = Queryable.Select<Product, int>(
            Queryable.Where<Product>(
                products,
                Expression.Lambda<Func<Product, bool>>(
                    Expression.Equal(Expression.Property(prm1, "Name"), Expression.Constant("Widget")), prm1)),
            Expression.Lambda<Func<Product, int>>(Expression.Property(prm2, "ProductID"), prm2));

    If the “products” expression had type IEnumerable<Product>, the compiler would have chosen the Enumerable.Where and Enumerable.Select extension methods instead, in which case the anonymous functions would have been converted to delegates. At this point, we’ve reduced the LINQ query to familiar code that will compile in C# 2.0. (Note that I’m using C# snippets to illustrate transformations that occur in the compiler, not to suggest a viable compiler design!)

    Runtime

    When the above program is executed, the Queryable.Where method is invoked. It takes two arguments. The first is an IQueryable<> instance that exposes an Expression property and a Provider property. The second is an expression tree. The Queryable.Where method implementation looks something like this:

        public static IQueryable<T> Where<T>(this IQueryable<T> source, Expression<Func<T, bool>> predicate)
        {
            return source.Provider.CreateQuery<T>(
                Expression.Call(/* MethodInfo of this Where method */, source.Expression, Expression.Quote(predicate)));
        }

    Notice that the method is really just composing a new expression tree that calls itself with arguments derived from the source and predicate arguments. Also notice that the query object returned from the method is associated with the same provider as the source query. By invoking operator methods, we’re constructing an expression tree that describes a query. Interestingly, the compiler and operator methods are colluding to construct a query expression tree.
    The important takeaway is that expression trees are built in one of two ways: (1) by the compiler when it sees an anonymous function that needs to be converted to an expression tree; and (2) by a query operator method that constructs a new queryable object with an expression tree rooted in a call to the operator method (self-referential).

    Next we hit the foreach block. At this point, the power of LINQ queries becomes apparent. The provider is able to determine how the query expression tree is evaluated! The code that began our story was intentionally vague about the definition of the “products” collection. Maybe it is a queryable in-memory collection of products:

        var products = new[]
            { new Product { Name = "Widget", ProductID = 1 } }.AsQueryable();

    The in-memory LINQ provider works by rewriting Queryable method calls to Enumerable method calls in the query expression tree. It then compiles the expression tree and evaluates it. It should be mentioned that the provider does not blindly rewrite all Queryable calls. It only rewrites a call when its arguments have been rewritten in a way that introduces a type mismatch, e.g. the first argument to Queryable.Where<Product> being rewritten as an expression of type IEnumerable<Product> from IQueryable<Product>. The type mismatch is triggered initially by a “leaf” expression like the one associated with the AsQueryable query: when the provider recognizes one of its own leaf expressions, it replaces the expression with the original IEnumerable<> constant expression. I like to think of this rewrite process as “type irritation” because the rewritten leaf expression is like a foreign body that triggers an immune response (further rewrites) in the tree. The technique ensures that only those portions of the expression tree constructed by a particular provider are rewritten by that provider: no type irritation, no rewrite.

    Let’s consider the behavior of an alternative LINQ provider. If “products” is a collection created by a LINQ to SQL provider:

        var products = new NorthwindDataContext().Products;

    the provider rewrites the expression tree as a SQL query that is then evaluated by your favorite RDBMS. The predicate may ultimately be evaluated using an index! In this example, the expression associated with the Products property is the “leaf” expression.

    StreamInsight 2.1

    For the in-memory LINQ to Objects provider, a leaf is an in-memory collection. For LINQ to SQL, a leaf is a table or view. When defining a “process” in StreamInsight 2.1, what is a leaf? To StreamInsight a leaf is logic: an adapter, a sequence, or even a query targeting an entirely different LINQ provider! How do we represent the logic? Remember that a standing query may outlive the client that provisioned it. A reference to a sequence object in the client application is therefore not terribly useful. But if we instead represent the code constructing the sequence as an expression, we can host the sequence in the server:

        using (var server = Server.Connect(...))
        {
            var app = server.Applications["my application"];
            var source = app.DefineObservable(() => Observable.Range(0, 10, Scheduler.NewThread));
            var query = from i in source where i % 2 == 0 select i;
        }

    Example 1: defining a source and composing a query

    Let’s look in more detail at what’s happening in example 1. We first connect to the remote server and retrieve an existing app. Next, we define a simple Reactive sequence using the Observable.Range method.
    Notice that the call to the Range method is in the body of an anonymous function. This is important because it means the source sequence definition is in the form of an expression, rather than simply an opaque reference to an IObservable<int> object. The variation in Example 2 fails. Although it looks similar, the sequence is now a reference to an in-memory observable collection:

        var local = Observable.Range(0, 10, Scheduler.NewThread);
        var source = app.DefineObservable(() => local); // can’t serialize ‘local’!

    Example 2: error referencing unserializable local object

    The Define* methods support definitions of operator tree leaves that target the StreamInsight server. These methods all have the same basic structure. The definition argument is a lambda expression taking between 0 and 16 arguments and returning a source or sink. The method returns a proxy for the source or sink that can then be used for the usual style of LINQ query composition. The “define” methods exploit the compile-time C# feature that converts anonymous functions into translatable expression trees! Query composition exploits the runtime pattern that allows expression trees to be constructed by operators taking queryable and expression (Expression<>) arguments. The practical upshot: once you’ve Defined a source, you can compose LINQ queries in the familiar way using query expressions and operator combinators. Notably, queries can be composed using pull sequences (LINQ to Objects IQueryable<> inputs), push sequences (Reactive IQbservable<> inputs), and temporal sequences (StreamInsight IQStreamable<> inputs). You can even construct processes that span these three domains using “bridge” method overloads (ToEnumerable, ToObservable and To*Streamable).

    Finally, the targeted rewrite via the type irritation pattern is used to ensure that StreamInsight computations can leverage other LINQ providers as well. Consider the following example (this example depends on Interactive Extensions):

        var source = app.DefineEnumerable((int id) =>
            EnumerableEx.Using(() =>
                new NorthwindDataContext(), context =>
                    from p in context.Products
                    where p.ProductID == id
                    select p.ProductName));

    Within the definition, StreamInsight has no reason to suspect that it ‘owns’ the Queryable.Where and Queryable.Select calls, and it can therefore defer to LINQ to SQL! Let’s use this source in the context of a StreamInsight process:

        var sink = app.DefineObserver(() => Observer.Create<string>(Console.WriteLine));
        var query = from name in source(1).ToObservable()
                    where name == "Widget"
                    select name;
        using (query.Bind(sink).Run("process"))
        {
            ...
        }

    When we run the binding, the source portion which filters on product ID and projects the product name is evaluated by SQL Server. Outside of the definition, responsibility for evaluation shifts to the StreamInsight server where we create a bridge to the Reactive Framework (using ToObservable) and evaluate an additional predicate. It’s incredibly easy to define computations that span multiple domains using these new features in StreamInsight 2.1!

    Regards,
    The StreamInsight Team
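    As a small aside to the compile-time story above, the delegate-versus-expression-tree split is easy to see in isolation: the same lambda text becomes IL or data depending only on the declared type, and that same split decides whether Enumerable or Queryable operators bind. A self-contained sketch (the Product type here is just a stand-in, not part of the original post):

    ```csharp
    using System;
    using System.Linq;
    using System.Linq.Expressions;

    class Product
    {
        public string Name { get; set; }
        public int ProductID { get; set; }
    }

    static class Demo
    {
        static void Main()
        {
            // Same lambda text, two very different compiler outputs:
            Func<Product, bool> asDelegate = p => p.Name == "Widget";           // compiled to IL
            Expression<Func<Product, bool>> asTree = p => p.Name == "Widget";   // built as an expression tree

            Console.WriteLine(asTree.Body);   // e.g. prints: (p.Name == "Widget")

            // The same split decides which operators bind in a query:
            var products = new[] { new Product { Name = "Widget", ProductID = 1 } };
            var enumerableQuery = products.Where(asDelegate);             // Enumerable.Where (delegates)
            var queryableQuery  = products.AsQueryable().Where(asTree);   // Queryable.Where (expression trees)

            Console.WriteLine(queryableQuery.Expression);   // the composed query expression tree
        }
    }
    ```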

    Read the article

  • Get the "source network address" in Event ID 529 audit entries on Windows XP

    - by Make it useful Keep it simple
    In Windows Server 2003, when an Event 529 (logon failure) occurs with a logon type of 10 (remote logon), the source network IP address is recorded in the event log. On a Windows XP machine, this (and some other details) are omitted. If a bot is trying a brute force over RDP (some of my XP machines are, and need to be, exposed with a public IP address), I cannot see the originating IP address, so I don't know what to block (with a script I run every few minutes). The DC does not log this detail either when the logon attempt is to the client XP machine and the DC is only asked to authenticate the credentials. Any help getting this detail in the log would be appreciated.

    Read the article

  • Debian pure-ftpd, Restrict access

    - by durduvakis
    I am running Debian Wheezy with ISPConfig 3 plus ModSecurity, and I would like to restrict FTP access to specific IP(s) globally (not just for specific FTP users); that can be 127.0.0.1 or an address I would manually add later. I would also like to completely disable FTP access from the web, but allow it only from FTP client software (if that is possible). Closing firewall ports is not the idea I want; I know I could do this with some firewall rule, but that is not what I currently need. I have managed to do this, for example, for phpMyAdmin inside its .conf file, but unfortunately I cannot find any configuration to alter for pure-ftpd on my system. Restricting web FTP access may be possible by adding some rule in the apache2 conf, but I am not sure how to write such a rule. Thanks to everyone who can help. Cheers

    Read the article

  • Connecting to FileZilla FTP Server on XAMPP localhost with Dreamweaver

    - by Keyslinger
    I am running XAMPP 1.7.4. I've installed the FileZilla FTP server and I'm running it as a service. I created a user, and when I connect as that user with the FileZilla client, I have no trouble connecting. However, in Dreamweaver CS5, when I create a site using the Manage Sites dialog and go to configure its FTP settings, I get a message that reads "An FTP error occurred - cannot make connection to host." and goes on to list some possible causes and solutions. I have tried setting the connection as passive, to no avail. I understand that it is not necessary to use FTP when I am editing the site locally, but this is for a Dreamweaver class I am teaching and I want my students to learn to use Dreamweaver's FTP tools. How can I make my localhost FTP connection work in Dreamweaver?

    Read the article

  • How can I copy the output from a remote command into the local clipboard?

    - by cwd
    I use iTerm2 as my terminal client in Mac OS X. On the local system I can use pbcopy and pbpaste to transfer data between the system clipboard and the terminal, but of course this doesn't work when you're ssh'ed into another machine. Is there some way I can take the result of a command and copy it to the clipboard automatically? Perhaps an AppleScript to grab the text in the iTerm window, then get the next-to-last line? For instance, if I wanted to copy the current working directory: I run pwd, then use the mouse to select the text, and then press command + c. Is there any better / faster / automatic way of doing this? I'm not looking for a bulletproof solution that would work for every command (e.g. it might not work when there is a huge scrollback) - I'm just looking for something to make this task that I do quite often a little less tedious. Update: I'm looking into using screen to do this, but I'm still not sure if it is possible.

    Read the article

  • How to extract jpegs from a video file using ffmpeg

    - by Andrew Simpson
    I am using C# and ffmpeg. In this scenario I have 279 individual JPEGs, and I have used ffmpeg to create an AVI file from these images on my client.
    CMD line: -f image2 -r 10 -i "C:\000EC902F17F\img%05d.jpg" -s 352x288 -y "C:\1\test.avi"
    I then upload the AVI to my server and extract JPEGs from it.
    CMD line: -i c:\1\1.avi c:\1\img-%05d.jpg
    I get 265 JPEGs back. Obviously ffmpeg is dropping these frames (most probably) when the AVI is first created. Is there a way to 'force' it to encode using ALL the images I have? Thanks. PS: I did not specify any command line options other than the size of the video output. As far as I am aware, if none are specified then ffmpeg automatically chooses the best ones?
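    For reference, the first command line above can be driven from C# roughly as follows. This is only a sketch: it simply mirrors the arguments quoted in the question, and the path to ffmpeg.exe is an assumption.

    ```csharp
    using System.Diagnostics;

    class FfmpegRunner
    {
        static void Main()
        {
            var psi = new ProcessStartInfo
            {
                FileName = @"C:\ffmpeg\bin\ffmpeg.exe",   // assumed location of the ffmpeg binary
                Arguments = @"-f image2 -r 10 -i ""C:\000EC902F17F\img%05d.jpg"" -s 352x288 -y ""C:\1\test.avi""",
                UseShellExecute = false,
                RedirectStandardError = true   // ffmpeg writes its progress/log to stderr
            };

            using (var process = Process.Start(psi))
            {
                string log = process.StandardError.ReadToEnd();   // capture the encode log
                process.WaitForExit();
                // Inspect 'log' for dropped/duplicated-frame messages to see what the encoder did.
            }
        }
    }
    ```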

    Read the article

  • Browser extension (or other software) to delay page load

    - by Doug Harris
    The alt text to today's comic at xkcd.com (strip below) says:

        After years of trying, I broke this habit in a day by decoupling the action and the neurological reward. I set up a simple 30-second delay I had to wait through, in which I couldn't do anything else, before any new page or chat client would load (and only allowed one to run at once). The urge to check all those sites magically vanished--and my 'productive' computer use was unaffected.

    (bold is my emphasis) Does anybody know of a browser extension or other software that will add this sort of delay? I've seen extensions which simply block sites, but not a delay like this.

    Read the article

  • How are SQL Server CALs counted?

    - by Sam
    Running a SQL Server, as far as I understand it, you need one CAL for every user who connects to the database server. But what happens if the only computer accessing the SQL Server is the server running your business layer? Say, for example, you have one SQL Server and one business logic server, and 100 clients who all just query and use the business logic server. No client uses the SQL Server directly; no one is even allowed to contact it. So, since there is only one computer using the SQL Server, do I need only one CAL? I somehow can't believe this would count as only one CAL needed for the SQL Server, but I would like to know why not.

    Read the article

  • I want to version control my entire slice

    - by Tom
    I'm renting a slice (i.e., a VPS) from Slicehost. I've spent a day or two filling up /usr with my favorite packages, /etc with configs and init scripts, and so on. Now I want to:
    1. save this whole setup somewhere (e.g., to load onto another machine)
    2. see what changes I've made to which files
    3. revert changes, tag revisions, and all that other good version control stuff
    Saving a disk image gives me (1), but not (2) and (3). Using Subversion (svn import / svn://someotherhost) might give me all three, but I expect problems if I actually try to check a project out into / and maintain .svn directories in root-owned areas. And to load my setup onto a fresh slice, I'd need to install an svn client on it first. Is there a good way to do what I want to do?

    Read the article

  • Diskless with Ubuntu 12.04

    - by user139462
    I'm trying to set up a new diskless solution with Ubuntu 12.04, without any success. I followed this howto: https://help.ubuntu.com/community/DisklessUbuntuHowto but the initramfs does not seem to be able to mount my NFS share.

    On the server side, my /etc/exports is:
        /srv/nfs4 192.168.0.0/24(fsid=0,rw,no_subtree_check)
        /srv/nfs4/nfsroot 192.168.0.0/24(rw,no_root_squash,no_subtree_check,fsid=1,nohide,insecure,sync)

    I'm able to mount my NFS share on a standard Ubuntu installation without any problem. I can mount it on any client with either of these commands:
        mount 192.168.0.3:/nfsroot /mnt
        mount 192.168.0.3:/srv/nfs4/nfsroot /mnt

    My /tftpboot/pxelinux.cfg/default config file is:
        DEFAULT vmlinuz-3.5.0-25-generic root=/dev/nfs initrd=initrd.img-3.5.0-25-generic nfsroot=192.168.0.3:/nfsroot ip=dhcp rw
    I also tried:
        DEFAULT vmlinuz-3.5.0-25-generic root=/dev/nfs initrd=initrd.img-3.5.0-25-generic nfsroot=192.168.0.3:/srv/nfs4/nfsroot ip=dhcp rw

    What I got in the initramfs:

    With the setting [nfsroot=192.168.0.3:/nfsroot], the diskless client's output is:
        mount call failed - server replied: Permission denied
    and the syslog of my NFS server shows:
        rpc.mountd[1266]: refused mount request from 192.168.0.10 for /nfsroot (/): not exported

    With the setting [nfsroot=192.168.0.3:/srv/nfs4/nfsroot], the diskless client's output is:
        mount: the kernel lacks NFS v3 support
    and the syslog of my NFS server shows:
        Mar 11 14:03:06 BootFromLan rpc.mountd[1266]: authenticated mount request from 192.168.0.10:834 for /srv/nfs4/nfsroot (/srv/nfs4/nfsroot)
        Mar 11 14:03:06 BootFromLan rpc.mountd[1266]: refused unmount request from 192.168.0.10 for /root (/): not exported

    Read the article

  • How can I reroute a sub-domain to localhost + port number?

    - by urig
    I have several web applications running on my developer machine. They mimic our production web applications, which are hosted on sub-domains. For example:
    api.myserver.com - is mimicked by 127.0.0.1:8000
    www.myserver.com - is mimicked by 127.0.0.1:8008
    and so on...
    How can I make it so that, on my Windows 7 machine, HTTP calls to "api.myserver.com" (note the lack of a port number) are redirected to 127.0.0.1:8000, etc.? Note that this needs to apply both to client-side calls (in the browser) and server-side calls (from IIS to the Python development server and vice versa). Do I need a proxy running locally to achieve this? Can you recommend such a tool?

    Read the article

  • Windows 2003 SMTP virtual server, why emails are not delivered?

    - by bardan
    I configured Windows 2003 as my email server. Everything works fine with POP3 (I'm able to receive emails); the problem is with SMTP, and I can't figure out how to find where exactly the problem is, because email looks like it is sent but recipients don't receive anything. I had some problems with relaying, but fixed everything, and now I have configured Outlook Express on the same machine and am trying to send emails. Everything looks fine - the email goes to the Sent folder, no errors - but recipients (I tried several different ones) don't receive any messages. I tried to test from the same machine with telnet, as described at http://support.microsoft.com/kb/153119, and everything looks OK...

    Read the article

  • As a developer, how do I learn sales? [closed]

    - by Dan Abramov
    I quit the company I was working for to pursue an opportunity at a startup, and I believe in our product. I'm sure it's going to be great if we attract some customers first to keep going. (I don't want funding.) Our product is targeted at private schools and courses, and helps organize the mess other LMSs introduce. The problem is, our team is basically just me, and I have very little idea about sales and marketing. I can do reasonably good copywriting, but I'm sure I can do better - and being nervous or too techy in a real-world conversation with the client doesn't help. I want to get better, in fact a lot better, at negotiating with clients and pitching my product. I did look for some "sales articles" on the web, and a lot of what I found is plain bullshit on SEO-engineered websites promoting books or $5000 courses. What I need instead is a developer's perspective on how to sell a product you think is great. What are typical programmer's mistakes and misconceptions about sales, and how do you avoid them? How do you evolve into a reasonably great salesman? I can't believe it's all in the mindset and unlearnable. Your own experience, combined with great articles available on the web, is most welcome.
    To Future Readers: The question got closed because it is not a good fit for this site. I found some helpful tips in a similar question asked on a sister StackExchange site about startups: "I'm a terrible salesperson. What can I do about it?"

    Read the article

  • Outlook 2007 Clients attachments not opening correctly

    - by Az
    Hello, thanks for taking the time to read this. We are having an issue on our Terminal Server clients that has me perplexed. We are running a Server 2003 x64 and Exchange 2007 environment. When we attempt to open an attachment, for example a .jpeg, and the user selects to open the file, it prompts them to select a program to open the file with. The file associations appear to be fine: if I save the document to the desktop and open it, it opens with the correct program automatically. If I select "always open using this program" it will then open automatically, but I don't want to have to do this for all the file types we open regularly on each client. Is this some sort of Exchange server security setting that is forcing them to associate the file? Does Outlook or Exchange maintain its own file association database? Thanks for reading!

    Read the article

  • Remote Control Home PC from Corporate Work PC

    - by muncherelli
    Here is my situation: I am currently on a Windows XP workstation at work. I have an Android tablet that I use to Splashtop into my home PC. I would like to be able to use my work keyboard and mouse to control my home PC while I am Splashtop'd into it using my tablet. My work PC is on a corporate LAN, and not on the same network as my tablet. The company I work for provides wifi for personal devices, but those devices cannot access the internal network. I thought about going the Synergy route, but that would require my home PC to be able to connect to my work PC, which isn't really possible. The opposite would work, though, if I could reverse-connect the server to the client, but the Synergy software doesn't really support that. I do have a couple of Linux boxes running at home, so I can ssh into my home network and tunnel ports via SSH if needed. With what I have, how can I accomplish seamless keyboard and mouse sharing between my work PC and either my home PC or my Android tablet?

    Read the article

  • New website - best practice for requirements specs? [closed]

    - by Alex K.
    Possible Duplicate: Extracting user requirements from a person who does not know how to express himself
    As a hobby freelancer I'm new to this; I've never before had a non-technical client explain to me what his future website is supposed to do. A person wants me to make a website for him, and he has basically explained to me what it's about. However, he's not a technical person, and he just doesn't understand what I need to know or how to properly describe/explain it to me. When I ask him how a user is supposed to submit an entry to the website, he tells me "He fills out a form.", which is not really helping me. This was just an example; it goes on for other sections of the website as well, which are a lot harder to explain. The website will be aimed at a specific professional user demographic, and I have no clue about their profession and how their industry works. I tried to find some good product requirements document templates on Google, but none of them really seemed like they could help him understand how to write it so that I can understand what he wants/needs. Can somebody please give me a hint on how to deal with such non-technical clients?

    Read the article

  • How to rsync a large file, with as little CPU and bandwidth expense as possible?

    - by Johan Allgoth
    I have a 500 GB file that I plan on backing up remotely. The file changes often. I'll be rsyncing it from a desktop to a server. Both can run the rsync client or server. What is the proper command for this? The ones I've tried so far have been taking forever or simply acted strangely. Example and results:
    rsync -cv --partial --inplace --no-whole-file /desktop/file1 myserver.com::module/file1
    This seems to work, but only if I do it twice (?!). Also, it's slow. Does the above command do the checksumming on both computers, or only on the sending one? Is it correct otherwise?

    Read the article

  • CodePlex Daily Summary for Friday, August 08, 2014

    CodePlex Daily Summary for Friday, August 08, 2014

    Popular Releases

    - Space Engineers Server Manager: SESM V1.13: V1.13 - Added the restore option in the backup manager - Reenabled the map upload for the managers
    - Strata v1.1 - Adobe Photoshop Like graphics editor: Strata 1.1: Version Strata 1.1: Available feutures: Drawing a layer; Adding a layer; Deleting a layer; Moving layers; Changing the order of layers; Hide / show layer; Saving as Project; Saving as Picture format (Jpg, Png, Gif) All interface in russian language. Saving - is Ok, But Opening have some display troubles
    - Essence#: Nile (Alpha Build 20): The Nile release introduces ANSI-conformant streams into Essence#. It also fixes some significant bugs, and provides new utility scripts for use in developing Essence# code. The name Nile was chosen because it's the name of a rather big stream that's mentioned in the Bible--and we've been using a Biblical naming scheme for the Alpha releases. Recall that Moses as a babe was found among reeds along the banks of the Nile. So the Nile is a reed stream... FileStream Usage Examples: Two new exa...
    - Instant Beautiful Browsing: IBB 14.3 Alpha: An alpha release of IBB. After 3 years of the last release this version is made from scratch, with tons of new features like: Make your own IBB aps. HTML 5. Better UI. Extreme Windows 8 resemblance. Photos. Store. Movement TONS of times smother compared to previous versions. Remember that this is AN ALPHA release, I hope I will have "IBB 14" finished by December. The documentation on how to create a new application for IBB will come next month
    - jQuery List DragSort: jQuery List DragSort 0.5.2: Fixed scrollContainer removing deprecated use of $.browser so should now work with latest version of jQuery. Added the ability to return false in dragEnd to revert sort order. Project changes: Added nuget package for dragsort https://www.nuget.org/packages/dragsort Converted repository from SVN to Mercurial
    - Wix# (WixSharp) - managed interface for WiX: Release 1.0.0.0: Release 1.0.0.0 Custom UI Custom MSI Dialog Custom CLR Dialog External UI
    - Recaptcha for .NET: Recaptcha for .NET v1.6.0: What's New? Bug fixes Optimized code
    - Math.NET Numerics: Math.NET Numerics v3.2.0: Linear Algebra: Vector.Map2 (map2 in F#), storage-optimized Linear Algebra: fix RemoveColumn/Row early index bound check (was not strict enough) Statistics: Entropy ~Jeff Mastry Interpolation: use Array.BinarySearch instead of local implementation ~Candy Chiu Resources: fix a corrupted exception message string Portable Build: support .Net 4.0 as well by using profile 328 instead of 344. .Net 3.5: F# extensions now support .Net 3.5 as well .Net 3.5: NuGet package now contains pro...
    - AutomatedLab: AutomatedLab 2.2.0.0: 2.2.0 Support for Subordinate Certificate Authorities Installing software does no longer use workflows but background jobs, which is much faster Many performance improvements Removing a lab does no longer require to import it first if using the Path parameter Adjusted all sample scripts to work with version 2.x New validators to verify virtual switch settings Bug fixing 2.1.0 Support for external virtual switches CaRoot is a new role for installing Root Certificate Authorities ...
    - babelua: 1.6.5.1: V1.6.5.1 - 2014.8.7 New feature: Formatting code; Stability improvement: fix a bug that pop up error "System.Net.WebResponse EndGetResponse";
    - Virto Commerce Enterprise Open Source eCommerce Platform (asp.net mvc): Virto Commerce 1.11: Virto Commerce Community Edition version 1.11. To install the SDK package, please refer to SDK getting started documentation. To configure the source code package, please refer to Source code getting started documentation. This release includes many bug fixes and minor improvements. More details about this release can be found on our blog at http://blog.virtocommerce.com.
    - Online Resume Parsing Using Aspose.Words for .NET: Resume_Parser: First Release of Resume Parser Application using Aspose.Words for .NET.
    - Blade.Net: 3.0.0.0: Blade.Controls added: collection of MVVM and Prism friendly WPF controls and utilities InteractionRequest based implementation for OpenFile/SafeFile dialogs InteractionRequest based implementation for print dialog Drag&Drop behaviors Focus behaviors TextBox behaviors PopupWindowActionRegionAdapter PropagateInputBindingsToWindowBehavior Blade.Forest added: simple backlog tool Backlog items are structured via drag&drop in one tree Backlog items dragged to and structured in sepa...
    - BBImageHandler - An image generator for DotNetNuke and ASP.NET: 01.06.00: Release notes V 1.6.0 Added 2 configuration properties: ServerCacheExpiration (value in seconds, Default: 600 seconds) + ClientCacheExpiration (value in seconds, Default: 300 seconds) Fixed Client Caching (now sending 304 when cache time is not expired) Fixed bug when attaching watermark to indexed image format
    - Role Based View for Microsoft Dynamic CRM 2011 & 2013: Role based view for MS CRM 2011 Ver. 1.0: One of the very use full features called "Role Based View" in Microsoft Dynamics CRM 2011 & 2013 has been ignored by Microsoft. Now the functionality is provided with this small solution which will allowing you to have different views for an entity that can be assigned to different security roles. For example, Certain Views will not appear for the Sales person since their security level is lower than that of the Sales Manager. However Sales Manager can able to see those additional view.
    - Json.NET: Json.NET 6.0 Release 4: New feature - Added Merge to LINQ to JSON New feature - Added JValue.CreateNull and JValue.CreateUndefined New feature - Added Windows Phone 8.1 support to .NET 4.0 portable assembly New feature - Added OverrideCreator to JsonObjectContract New feature - Added support for overriding the creation of interfaces and abstract types New feature - Added support for reading UUID BSON binary values as a Guid New feature - Added MetadataPropertyHandling.Ignore New feature - Improv...
    - VidCoder: 1.5.24 Beta: Added NL-Means denoiser. Updated HandBrake core to SVN 6254. Added extra error handling to DVD player code to avoid a crash when the player was moved.
    - PowerShell App Deployment Toolkit: PowerShell App Deployment Toolkit v3.1.5: *Added Send-Keys function to send a sequence of keys to an application window (Thanks to mmashwani) *Added 3 optimization/stability improvements to Execute-Process following MS best practice (Thanks to mmashwani) *Fixed issue where Execute-MSI did not use value from XML file for uninstall but instead ran all uninstalls silently by default *Fixed error on 1641 exit code (should be a success like 3010) *Fixed issue with error handling in Invoke-SCCMTask *Fixed issue with deferral dates where th...
    - SEToolbox: SEToolbox 01.041.012 Release 1: Added voxel material textures to read in with mods. Fixed missing texture replacements for mods. Fixed rounding issue in raytrace code. Fixed repair issue with corrupt checkpoint file. Fixed issue with updated SE binaries 01.041.012 using new container configuration.
    - Magick.NET: Magick.NET 6.8.9.601: Magick.NET linked with ImageMagick 6.8.9.6 Breaking changes: - Changed arguments for the Map method of MagickImage. - QuantizeSettings uses Riemersma by default.

    New Projects

    - Baidu BCS: Baidu BCS (Baidu Cloud Storage) Server SDK. ????????SDK. .NET 4.0 or above. .NET 4.0?????
    - Close Eye Assistant: a Close Eye Assistant using python3
    - Dynamics AX IEIDE Project Explorer: This project is aiming to provide a set of useful features for Dynamics AX developers as well as administrators and is provided as an installable axmodel file.
    - EFlogger - profiler for Entity Framework: Free and simple open source profiler for Entity Framework from 4-6 version
    - Guitar: i will continue to improve it.
    - jQuery Table Pager Plugin: Simple jQuery plugin to paginate a table.
    - JSJD: ????
    - LFramework: LFramework
    - MSDIS MVC Web: Sample ASP.NET MVC 4.0 using AngularJS
    - PJS2: Not complete Powershell implementation in JavaScript.
    - Seemile: Seemile
    - Shippety: Shippety is a web application for buying and printing postage labels. It's a client implementation for the EasyPost API written in HTML, Javascript and ASP.NET.
    - Spending Monitor: With this application the user has the ability to create categories, retailers, and payment methods to track their spending in those categories.
    - Strata v1.1 - Adobe Photoshop Like graphics editor: Strata - a graphical editor with layers mechanism similar to Adobe Photoshop. Layer - flat two-dimensional bitmap, where you can draw with the mouse.
    - surfingkata: a
    - TIKSN PowerShell Cmdlets: Bunch of cmdlets developed by TIKSN Lab
    - ???????: dfg
    - ???????: ????????????????: cvbvcbcv

    Read the article

  • Altq limits not being applied to UDP transfers

    - by overkordbaever
    I have an OpenBSD server acting as a router/firewall with the packet filter ruleset shown below, plus a Linux server and a Linux client. When transferring files (using netcat) over TCP, the limits are applied (for example the 100Mbit limit in the example), but when transferring data over UDP, the limits aren't applied; the file always takes the same amount of time no matter the queue bandwidth limit I set (I can even turn off the queues completely and will still get the same result). Why aren't the queuing rules applied to UDP packets? The rules used:

        #queue rules
        altq on { $int_if, $ext_if } cbq bandwidth 100Mb queue { def, low }
        queue def bandwidth 0Mb cbq(default)
        queue low bandwidth 100Mb cbq

        #Passrules test
        pass out quick from $int_if to $ext_if queue low
        pass in quick from $ext_if to $int_if queue low
        pass out quick from $ext_if to $int_if queue low
        pass in quick from $int_if to $ext_if queue low

    I suppose this may be related to a question I've previously asked, but since it's more of a separate question, I suppose a separate question should be used for this.

    Read the article

  • How to broadcast a command on Windows

    - by Xiao Jia
    I am going to frequently deploy different versions of a program on a cluster of Windows machines (mostly Windows XP), so I am willing to use a command-line broadcasting tool (either built-in or 3rd-party) to (1) download a file from some URL, and (2) execute the same command, on all the machines. I googled for a very long time but got nothing related to my goal. (Only pages about broadcasting a message, broadcasting ping, or programmatically broadcasting via TCP/IP, etc.) Are there any tools for this purpose? Or is it possible to do it programmatically (without installing extra client programs on those machines)?

    Read the article

  • Getting Synergy+ to work with Windows 7 64bit and Windows XP 32bit?

    - by john crisp
    My Windows XP box is a Dell Dimension 4550. The Windows 7 box is Home Premium. Synergy+ installs fine on the Windows 7 box, but using Windows XP as the client, it won't work. Do I need to disconnect the VGA connection from my monitor and control entirely from the Win7 DVI connection? In setting up (linking) Windows 7 as the server, I have NO way of inputting info. The setup on the Windows XP box is simple and goes OK, but I can't connect to the Windows XP box! It must be the VGA connection on the Windows XP box. In order to use both machines, I have to turn one off and the other on; hence the VGA is needed on Windows XP. Any info/help most appreciated.

    Read the article

  • Upgrading from MySQL Server to MariaDB

    - by Korrupzion
    I've heard that MariaDB has better performance than MySQL Server. I'm running software that makes intensive use of MySQL; that's why I want to try upgrading to MariaDB. Please tell me about your experiences doing this conversion, along with any instructions or tips. Also, which files should I take care of when making a backup of MySQL Server, so that if something goes wrong with MariaDB, I can roll back to MySQL without issues? I would use the following, but I'm not sure if it's enough to get a full backup of the MySQL Server configs and databases:
        mysqldump --all-databases
        backup /etc/mysql
    My environment:
        uname -a (Debian Lenny): Linux charizard 2.6.26-2-amd64 #1 SMP Thu Sep 16 15:56:38 UTC 2010 x86_64 GNU/Linux
        MySQL Server version: 5.0.51a-24+lenny4
        MySQL client: 5.0.51a
        Statistics: Threads: 25, Questions: 14690861, Slow queries: 9, Opens: 21428, Flush tables: 1, Open tables: 128, Queries per second avg: 162.666, Uptime: 1 day 1 hour 5 min 13 sec
    Thanks! PS: Rate my english :D

    Read the article
