Search Results

Search found 13937 results on 558 pages for 'save money'.

Page 94/558 | < Previous Page | 90 91 92 93 94 95 96 97 98 99 100 101  | Next Page >

  • SSH stops at "using username" with IPTables in effect

    - by Rautamiekka
    We used UFW but couldn't get the Source Dedicated ports open, which was weird, so we purged UFW and switched to IPTables, using Webmin to configure it. If the inbound chain is on DENY and the SSH port is open [judged from Webmin], PuTTY prints using username "root" and stops there instead of asking for the public-key passphrase. With the inbound chain on ACCEPT, the passphrase is asked for. This problem didn't happen with UFW. Picture of the IPTables configuration in Webmin: http://s284544448.onlinehome.us/public/PlusLINE%20Dedicated%20Server,%20Webmin,%20IPTables,%200.jpg (the address is the previous rautamiekka.org).

    iptables-save when on INPUT DENY:

      # Generated by iptables-save v1.4.8 on Wed Apr 11 16:09:20 2012
      *mangle
      :PREROUTING ACCEPT [1430:156843]
      :INPUT ACCEPT [1430:156843]
      :FORWARD ACCEPT [0:0]
      :OUTPUT ACCEPT [1415:781598]
      :POSTROUTING ACCEPT [1415:781598]
      COMMIT
      # Completed on Wed Apr 11 16:09:20 2012
      # Generated by iptables-save v1.4.8 on Wed Apr 11 16:09:20 2012
      *nat
      :PREROUTING ACCEPT [2:104]
      :POSTROUTING ACCEPT [0:0]
      :OUTPUT ACCEPT [0:0]
      COMMIT
      # Completed on Wed Apr 11 16:09:20 2012
      # Generated by iptables-save v1.4.8 on Wed Apr 11 16:09:20 2012
      *filter
      :INPUT DROP [0:0]
      :FORWARD ACCEPT [0:0]
      :OUTPUT ACCEPT [1247:708906]
      -A INPUT -i lo -m comment --comment "Machine-within traffic - always allowed" -j ACCEPT
      -A INPUT -p tcp -m comment --comment "Services - TCP" -m tcp -m multiport --dports 22,80,443,10000,20,21 -m state --state NEW,ESTABLISHED -j ACCEPT
      -A INPUT -p tcp -m comment --comment "Minecraft - TCP" -m tcp --dport 25565 -j ACCEPT
      -A INPUT -p udp -m comment --comment "Minecraft - UDP" -m udp --dport 25565 -j ACCEPT
      -A INPUT -p tcp -m comment --comment "Source Dedicated - TCP" -m tcp --dport 27015 -j ACCEPT
      -A INPUT -p udp -m comment --comment "Source Dedicated - UDP" -m udp -m multiport --dports 4380,27000:27030 -j ACCEPT
      -A INPUT -p udp -m comment --comment "TS3 - UDP - main port" -m udp --dport 9987 -j ACCEPT
      -A INPUT -p tcp -m comment --comment "TS3 - TCP - ServerQuery" -m tcp --dport 10011 -j ACCEPT
      -A OUTPUT -o lo -m comment --comment "Machine-within traffic - always allowed" -j ACCEPT
      COMMIT
      # Completed on Wed Apr 11 16:09:20 2012

    iptables --list when on INPUT DENY:

      Chain INPUT (policy DROP)
      target  prot opt source    destination
      ACCEPT  all  --  anywhere  anywhere  /* Machine-within traffic - always allowed */
      ACCEPT  tcp  --  anywhere  anywhere  /* Services - TCP */ tcp multiport dports ssh,www,https,webmin,ftp-data,ftp state NEW,ESTABLISHED
      ACCEPT  tcp  --  anywhere  anywhere  /* Minecraft - TCP */ tcp dpt:25565
      ACCEPT  udp  --  anywhere  anywhere  /* Minecraft - UDP */ udp dpt:25565
      ACCEPT  tcp  --  anywhere  anywhere  /* Source Dedicated - TCP */ tcp dpt:27015
      ACCEPT  udp  --  anywhere  anywhere  /* Source Dedicated - UDP */ udp multiport dports 4380,27000:27030
      ACCEPT  udp  --  anywhere  anywhere  /* TS3 - UDP - main port */ udp dpt:9987
      ACCEPT  tcp  --  anywhere  anywhere  /* TS3 - TCP - ServerQuery */ tcp dpt:10011

      Chain FORWARD (policy ACCEPT)
      target  prot opt source    destination

      Chain OUTPUT (policy ACCEPT)
      target  prot opt source    destination
      ACCEPT  all  --  anywhere  anywhere  /* Machine-within traffic - always allowed */

    The UFW rules prior to purging, on INPUT DENY:

      To                 Action    From
      127.0.0.1          ALLOW IN  127.0.0.1
      3306               DENY IN   Anywhere
      20,21/tcp          ALLOW IN  Anywhere
      22/tcp (OpenSSH)   ALLOW IN  Anywhere
      80/tcp             ALLOW IN  Anywhere
      443/tcp            ALLOW IN  Anywhere
      989                ALLOW IN  Anywhere
      990                ALLOW IN  Anywhere
      8075/tcp           ALLOW IN  Anywhere
      9987/udp           ALLOW IN  Anywhere
      10000/tcp          ALLOW IN  Anywhere
      10011/tcp          ALLOW IN  Anywhere
      25565/tcp          ALLOW IN  Anywhere
      27000:27030/tcp    ALLOW IN  Anywhere
      4380/udp           ALLOW IN  Anywhere
      27014:27050/tcp    ALLOW IN  Anywhere
      30033/tcp          ALLOW IN  Anywhere
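
    One likely culprit, judging only from the rules above (my reading, not something confirmed in the post): with the INPUT policy at DROP, there is no blanket rule accepting reply traffic for connections the server itself initiates. sshd typically does a reverse-DNS lookup on the connecting client before authentication; the outbound UDP query leaves fine (OUTPUT is ACCEPT), but the DNS answer is dropped on the way back in, so PuTTY hangs right after printing the username until the lookup times out. A minimal sketch of the conventional fix:

      # Sketch, assuming the state/conntrack match is available: accept
      # packets belonging to connections this machine initiated, e.g.
      # DNS answers for sshd's reverse lookup.
      iptables -I INPUT 1 -m state --state ESTABLISHED,RELATED -j ACCEPT

    Setting "UseDNS no" in /etc/ssh/sshd_config is another common way to test whether the stall is DNS-related.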

    Read the article

  • A few questions on Windows Explorer properties (Win 7)

    - by Nrew
    I've read this article from How-To Geek, but it didn't explain this entry, which appears in the Target box when you right-click the Windows Explorer shortcut and click Properties: %windir%\explorer.exe shell:desktop\Inbox And why does Local Disk E: show up when I have this one: %windir%\explorer.exe shell:E:\FINAL SAVE DATA I don't really get the code, especially the shell:desktop\Inbox part. What's that supposed to mean? How do I change it so that when I click on the Windows Explorer shortcut, I see this location: E:\FINAL SAVE DATA
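
    A side note of mine, not from the post: the "shell:" prefix is a moniker Explorer resolves against known folder names, so "shell:desktop\Inbox" should point at a folder named "Inbox" inside the Desktop folder; I am not certain how shell: treats a full drive path like "shell:E:\FINAL SAVE DATA", which may be why the drive itself opens. To open a plain folder, the shell: form isn't needed at all; a Target like the following should work (the quotes are my assumption, needed because the path contains a space):

      %windir%\explorer.exe "E:\FINAL SAVE DATA"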

    Read the article

  • I hyperlinked a cell in excel 2003, formula issues?

    - by joseinsomniac
    I have a budget spreadsheet in Excel 2003. I have my deposit, then all of my bills, the total, then a cell that has the difference (between the amount of the deposit and the total of the bills). The difference cell's numbers turn red when I don't have enough money (deposit vs. bill total). I hyperlinked the difference cell to a checkbook register spreadsheet so I can track where all my extra money went (I reconcile receipts daily). When hyperlinked, the numbers are blue. I need the numbers to stay black (when above 0.00) and stay red (when below 0.00), and not change after the link has been clicked. Also, if the link has not been clicked and the numbers are red, the font is smaller, even though the toolbar shows the font size hasn't changed. After I click on it and go back to the budget sheet, it's the size it should be. Any ideas? Thanks!
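
    One hedged workaround (a sketch of mine, not from the post): rather than relying on the cell's own font color, which the hyperlink's followed/unfollowed styles override, drive the red/black colors with conditional formatting, which takes precedence over the cell style. In Excel 2003, with the difference cell selected (the cell address is a placeholder):

      Format > Conditional Formatting...
      Condition 1:  Cell Value Is   less than   0    -> Format: red font
      (leave the cell's normal font black, for values >= 0.00)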

    Read the article

  • How to use WebDAV with NetBeans?

    - by ajsie
    I have a server where I save a lot of files, and I wonder how to use WebDAV with NetBeans. E.g.: import a WebDAV folder, edit the PHP files and save them, and have the changes apply immediately on the web server (since the WebDAV folder is the same as the Apache web root folder). I think this is a very smart solution, so that you can change the website immediately without having to download, change, and upload. Does anyone know how to accomplish this? (Feel free to propose other quick solutions for doing quick edits.)
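
    One way to get this (my sketch, not something from the post): mount the WebDAV share as a local filesystem and point NetBeans at it as an ordinary project folder, so every save lands directly in the Apache web root. On Linux with davfs2, assuming a hypothetical URL and mount point:

      # Debian/Ubuntu: install the WebDAV filesystem driver, then mount.
      sudo apt-get install davfs2
      sudo mkdir -p /mnt/webroot
      sudo mount -t davfs https://example.com/webdav /mnt/webroot
      # Open /mnt/webroot in NetBeans as the project folder; saving a PHP
      # file then writes straight to the server.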

    Read the article

  • Backing up a Linux VPS with RSync to Vista

    - by Frank
    I've been working on setting up a Linux VPS to host a couple of WordPress sites and eventually a Mercurial server. I've set up one site and things have gone well. However, before I start moving other things to the VPS, I need to set up a backup solution. My provider, Linode, suggests rsync (among a couple of other options) for backups. I've seen a few posts on this site that suggest other backup solutions, including going to the Amazon cloud, but that costs money, and the VPS is all the money I want to spend on this for the time being. So, to help solve that, I want my backup computer to be my home desktop. Assuming I'm using rsync, is it possible to use my Vista-based home computer as the destination for the backup? And if it is possible, what type of command or connection would I need to configure on the Vista machine? Any insight would be helpful. It's probably obvious, but I've never used rsync.
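
    For what it's worth, a sketch of the pulling direction (host and paths are placeholders, and it assumes an rsync port such as cwRsync or a Cygwin install on the Vista box, since Windows has no native rsync): run the backup from Windows and let it connect out to the VPS over SSH, which avoids opening any ports at home:

      # From a Cygwin/cwRsync prompt on the Vista machine:
      rsync -avz --delete -e ssh user@vps.example.com:/var/www/ \
          /cygdrive/c/Backups/vps/

    Scheduling that command with Task Scheduler would make it a nightly job.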

    Read the article

  • Compute number of occurrences in a column of a spreadsheet

    - by wnstnsmth
    I have a Google Drive spreadsheet with a single column that holds string values (Twitter screen names) such as "user1", "user1", "UserX", and I would like to count those values so that I can easily craft a bar chart out of it. So the result should be:

      value    occurrence
      -----------------------
      user1    2
      UserX    1
      ...      ...

    Please note, I only want to look for whole words, not parts of words. E.g., the words 'on' and 'one' appear inside the word 'money'; I would not count those (only the word 'money' itself is counted). Hope that is clear enough. What formula should I use?
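
    A hedged sketch with Google Sheets built-ins (the column and cell addresses are my assumptions): COUNTIF without wildcards matches whole cells only, so with one screen name per cell it naturally counts "whole words":

      B2:  =UNIQUE(A2:A)          one row per distinct screen name
      C2:  =COUNTIF(A$2:A, B2)    filled down: occurrences of the name in B2

    The B:C range can then feed the bar chart directly.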

    Read the article

  • How do you update an Excel file (Data Refresh and update formulas) WITHOUT opening the file?

    - by Alex
    I have an Excel file that I want to update and save automatically, without having to open it or interact with it manually. Manually, I open the file, hit data refresh (which runs a SQL query), hit F9 for the formulas to update, and then just close/save. (I then mail the file out to people using a Perl script, or use SAS JMP to run some numbers/charts and mail those out too. Basically, I need to script some things which require the XLS file to be updated.)
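
    One unattended approach (a sketch, assuming Excel is installed on the machine; the path is a placeholder): script Excel through COM so the refresh, recalculation, and save happen with no visible window, then schedule the script:

      # PowerShell sketch of the manual routine: open, refresh, F9, save.
      $excel = New-Object -ComObject Excel.Application
      $excel.Visible = $false
      $excel.DisplayAlerts = $false
      $wb = $excel.Workbooks.Open("C:\Reports\numbers.xls")
      $wb.RefreshAll()        # re-run the SQL data refresh
      $excel.CalculateFull()  # the F9 step
      $wb.Save()
      $wb.Close($false)       # already saved above
      $excel.Quit()

    One caveat: if the connection has "Enable background refresh" ticked, RefreshAll may return before the query finishes, so unticking that on the query's properties is usually necessary for this to be reliable.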

    Read the article

  • Generating and collaborating on network map diagrams?

    - by Ian C.
    I have to turn out some network topology maps for very large networks. I'd like the format for the maps to be something other people can also edit and contribute to regardless of what software I'm using on my Mac to build them. I don't mind spending money on my end for software, but I can't require that my clients spend any money. I also can't promise my clients are also using OS X -- they could be running Linux or Windows. Is there a best software application on OS X for producing maps that I can share with other, non-OS X, users? Is there a best format for sharing topology maps that I should use when exporting the maps to disk?

    Read the article

  • How to create a default association for files with no extension

    - by acidzombie24
    I am using http://www.pnotepad.org/ (I wouldn't mind switching to something else if it's lightweight and has most/all of the features I like, which I'll check on an app-by-app basis). When I create a new tab/file and save it, unless I type .txt I get a file with no extension, which makes it hard to open since I can't double-click it (I don't think I can tell Win7 to set a default app for files with no extension). How do I make pnotepad save with .txt when no extension is specified?
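
    On the parenthetical: Windows can in fact be told what to do with extensionless files through the special "." association; a hedged sketch from an elevated command prompt (a known trick, not something from the post, and I can't promise it works on every Win7 install; "txtfile" is the stock text-file class):

      :: Treat files with NO extension as text files:
      assoc .=txtfile
      :: Undo it later:
      assoc .=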

    Read the article

  • Amortization Schedule in Excel - Know how much interest will be saved by large payment

    - by hubbas
    I have a really nice amortization schedule built in Excel using the steps from this page: http://www.wikihow.com/Prepare-Amortization-Schedule-in-Excel It works really nicely, but I am planning to make some large payments, and I would love to calculate how much interest I will save, over the life of the loan, by making these larger payments. E.g., if I pay $10k in one payment, I will save $4000 in interest over the life of the loan, etc. Is there a way to calculate this?
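
    One hedged way to get the figure without rebuilding the whole schedule (cell addresses are placeholders of mine, not from the wikiHow sheet; B1 = annual rate, B2 = months remaining, B3 = current balance, B4 = monthly payment, B5 = the lump sum): total interest on a loan is what you pay minus what you borrowed, so compute it twice and subtract.

      Interest as scheduled:        =-CUMIPMT(B1/12, B2, B3, 1, B2, 0)
      Months left after lump sum:   =NPER(B1/12, -B4, B3-B5)
      Interest after lump sum:      =NPER(B1/12, -B4, B3-B5)*B4 - (B3-B5)
      Interest saved:               first result minus third result

    (CUMIPMT lives in the Analysis ToolPak in older Excel versions; the NPER line assumes the monthly payment stays the same so the term shortens, and it treats the fractional final payment approximately.)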

    Read the article

  • Get lots of javascript problems when using Opera 11.00 to surf

    - by s hanley
    Sites like eBay and even Super User stop working properly when I use Opera 11.00. Menus stop working everywhere from eBay to GoDaddy: hovering on a menu item doesn't expand it, and no sub-menu slides out. This makes a large number of very popular websites unusable. Am I right in assuming this is a JavaScript issue? I use Opera for the Turbo feature (I have tested Opera with and without Turbo, so it's not Turbo's fault) because I'm on mobile broadband until I get my phone line sorted out. Turbo helps me save money, as well as allowing me to surf at a sane speed. Is there a Firefox or Chrome equivalent to Opera Turbo that doesn't cost money? I'm using Opera 11.00, build 1156.

    Read the article

  • Synchronized Clipboard?

    - by Boris_yo
    Evernote, Dropbox, Xmarks and what not. Many know these, but is there software that can synchronize the clipboard between devices? For what, you might ask? Here is for what: sometimes I see a URL on my desktop/laptop computer that I want to enter into my laptop's/smartphone's browser, but unless I save that URL to a .TXT file, email it, or save it to Evernote, there is no way to do that directly. So, is there such software already?

    Read the article

  • Windows 7 showing subfiles when saving files from internet?

    - by mezamorphic
    Within the last week, whenever I go to save a file from the internet to "Documents", it doesn't show just what is at the top level of My Documents; it shows the contents of every sub-folder as well. I haven't changed any settings. How can I get this back to normal, so that files in sub-folders are not shown when saving? I tried with both Chrome and IE, so it's not the browser. The problem seems to happen only with My Documents, not with other folders like Downloads.

    Read the article

  • What is a good set and forget file version tracking / backup application for windows?

    - by tomwoods
    When I make changes to files, I keep finding myself doing "save as" and adding the current date to the file name. It slows me down, and it creates a bunch of files that clutter my folder. I would prefer to be able to right-click on a file in File Explorer and select saving different versions of it, so that each time I save, a copy goes somewhere I can get at in the future if necessary. Is there an application that achieves this?

    Read the article

  • Domain registration and DNS, what am I actually paying for? [on hold]

    - by jozxyqk
    Long story short, I'm quite confused as to exactly what is offered by domain registration and DNS service sites.

    When I go to the URL "http://google.com", my PC connects to a name server and gets the IP for "google.com", then connects to the IP and says: give me the page for "http://google.com". AFAIK there are many name servers and they all cache these bits of information in some hierarchical network, but ultimately a DNS record must come from a single source (not sure what this is called). There are different kinds of records, which might hold not an IP but an alias/redirect to other records, for example.

    Let's say I want my own domain name for some server. Maybe it even has a static IP, but I want a nicer thing for people to remember; or my ISP assigns dynamic IPs and I want a URL that always works; or my website is hosted on a shared machine, so the browser needs to send "http://mydnsname.com" to the web server to distinguish it from other requests to the same IP but for different sites.

    Registering a domain costs a small amount of money per year. Where does this money go (not that I'm complaining :P)? Is that really all it costs to maintain the entire DNS system of name servers? If I just register the domain and nothing else, what do I get? Is that just reserving a name, or hosting WHOIS information, or have I paid for a DNS record to be hosted? Can a domain alone have a record, such as an IP, or be an alias to another?

    A bunch of sites out there offer other services in addition to domain registration (I'm assuming they register the domain through another party for me). One example is "dynamic DNS" (DDNS), but isn't this just a regular DNS record that's updated regularly? Does it cost extra to update more often? Without DDNS, can a DNS record still point to an IP? I've also seen the term "managed DNS" and have no idea where that fits in.
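
    As an aside that may make the "hierarchical network" of name servers concrete (my sketch, not part of the post), dig can walk the delegation from the root servers down to the server that finally answers for the name:

      # Follow the chain: root servers -> .com servers -> the name
      # servers that hold the actual record for the domain.
      dig +trace mydnsname.com A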

    Read the article

  • Copyrights concerning code snippets and larger amounts of code

    - by JustcallmeDrago
    I am designing a public code repository. Users will be allowed to post and edit whatever amount of code they want, from code snippets to entire multi-file projects. I have a few major legal concerns about this:

    Not getting sued/shut down - I feel the site would be a much easier target than tracking down an individual user to sue. I have looked around a bit, and links to legal info in the footer of each page seem common. What specific things should I do, and what does a site such as YouTube (where I see copyrighted material all the time) do, for protection?

    Citing sources and editing sourced code - If a user wants to post code that isn't theirs, what concerns/safeguards should I have? Will a link suffice, and what do I need further to allow the code to be edited (to improve it, for example)? What can happen if a user posts copyrighted code without citing it?

    Large chunks of code - What legal differences should I look out for as the amount grows?

    Not having a mess of licenses for the site - I would like to have a single license (like Rosetta Code) that keeps things simple for interaction on the site. I want the code to be postable and editable. I have looked into Stack Overflow's Creative Commons license a little, and it says that:

      If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one.

    And on Rosetta Code:

      All software found on Rosetta Code should be considered potentially hazardous. Use at your own risk. Be aware that all code on Rosetta Code is under the GNU Free Documentation License, as are any edits made by contributors. See Rosetta Code:Copyrights for details.

    What other licenses are like this?

    Commercializing the site - In what ways can I and can't I make money off a site that contains code like this? All code will be publicly visible. Initial thoughts are having ads or charging for advanced features.

    Read the article

  • Understanding LINQ to SQL (11) Performance

    - by Dixin
    [LINQ via C# series] LINQ to SQL has a lot of great features, like strong typing, query compilation, deferred execution, a declarative paradigm, etc., which are very productive. Of course, these cannot be free, and one price is performance.

    O/R mapping overhead

    Because LINQ to SQL is based on O/R mapping, one obvious overhead is that changing data usually requires retrieving data first:

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                Product product = database.Products.Single(item => item.ProductID == id); // SELECT...
                product.UnitPrice = unitPrice; // UPDATE...
                database.SubmitChanges();
            }
        }

    Before updating an entity, that entity has to be retrieved by an extra SELECT query. This is slower than a direct data update via ADO.NET:

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (SqlConnection connection = new SqlConnection(
                "Data Source=localhost;Initial Catalog=Northwind;Integrated Security=True"))
            using (SqlCommand command = new SqlCommand(
                @"UPDATE [dbo].[Products] SET [UnitPrice] = @UnitPrice WHERE [ProductID] = @ProductID",
                connection))
            {
                command.Parameters.Add("@ProductID", SqlDbType.Int).Value = id;
                command.Parameters.Add("@UnitPrice", SqlDbType.Money).Value = unitPrice;
                connection.Open();
                command.Transaction = connection.BeginTransaction();
                command.ExecuteNonQuery(); // UPDATE...
                command.Transaction.Commit();
            }
        }

    The above imperative code specifies the "how to do" details and has better performance. For the same reason, some articles on the Internet insist that, when updating data via LINQ to SQL, the declarative code above should be replaced by:

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                database.ExecuteCommand(
                    "UPDATE [dbo].[Products] SET [UnitPrice] = {0} WHERE [ProductID] = {1}",
                    unitPrice, id);
            }
        }

    Or just create a stored procedure:

        CREATE PROCEDURE [dbo].[UpdateProductUnitPrice]
        (
            @ProductID INT,
            @UnitPrice MONEY
        )
        AS
        BEGIN
            BEGIN TRANSACTION
            UPDATE [dbo].[Products] SET [UnitPrice] = @UnitPrice WHERE [ProductID] = @ProductID
            COMMIT TRANSACTION
        END

    and map it as a method of NorthwindDataContext (explained in this post):

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                database.UpdateProductUnitPrice(id, unitPrice);
            }
        }

    As a normal trade-off for O/R mapping, a decision has to be made between performance overhead and programming productivity, according to the case. From a developer's perspective, if O/R mapping is chosen, I consistently choose the declarative LINQ code, unless this kind of overhead is unacceptable.

    Data retrieving overhead

    After the O/R-mapping-specific issue, now look into the LINQ to SQL specific issues, for example, performance of the data retrieving process. The previous post has explained that the SQL translating and executing is complex. Actually, the LINQ to SQL pipeline is similar to a compiler pipeline. It consists of about 15 steps to translate a C# expression tree to a SQL statement, which can be categorized as:

    - Convert: invoke SqlProvider.BuildQuery() to convert the tree of Expression nodes into a tree of SqlNode nodes;
    - Bind: use the visitor pattern to figure out the meanings of names according to the mapping info, like a property for a column, etc.;
    - Flatten: figure out the hierarchy of the query;
    - Rewrite: for SQL Server 2000, if needed;
    - Reduce: remove the unnecessary information from the tree;
    - Format: generate the SQL statement string;
    - Parameterize: figure out the parameters, for example, a reference to a local variable should be a parameter in SQL;
    - Materialize: execute the reader and convert the result back into typed objects.

    So each data retrieval, even one that looks simple:

        private static Product[] RetrieveProducts(int productId)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                return database.Products.Where(product => product.ProductID == productId)
                    .ToArray();
            }
        }

    goes through the above steps to translate and execute the query. Fortunately, there is a built-in way to cache the translated query.

    Compiled query

    When such a LINQ to SQL query is executed repeatedly, CompiledQuery can be used to translate the query once and execute it multiple times:

        internal static class CompiledQueries
        {
            private static readonly Func<NorthwindDataContext, int, Product[]> _retrieveProducts =
                CompiledQuery.Compile((NorthwindDataContext database, int productId) =>
                    database.Products.Where(product => product.ProductID == productId).ToArray());

            internal static Product[] RetrieveProducts(
                this NorthwindDataContext database, int productId)
            {
                return _retrieveProducts(database, productId);
            }
        }

    The new version of RetrieveProducts() gets better performance, because only when _retrieveProducts is invoked for the first time does it internally invoke SqlProvider.Compile() to translate the query expression. It also uses a lock to make sure the query is translated only once in multi-threading scenarios.

    Static SQL / stored procedures without translating

    Another way to avoid the translating overhead is to use static SQL or stored procedures, just as in the above examples. Because this is a functional programming series, this article does not dive into that. For the details, Scott Guthrie already has some excellent articles:

    - LINQ to SQL (Part 6: Retrieving Data Using Stored Procedures)
    - LINQ to SQL (Part 7: Updating our Database using Stored Procedures)
    - LINQ to SQL (Part 8: Executing Custom SQL Expressions)

    Data changing overhead

    Looking into the data updating process, it also needs a lot of work:

    - Begins a transaction
    - Processes the changes (ChangeProcessor):
      - walks through the objects to identify the changes;
      - determines the order of the changes;
      - executes the changes. LINQ queries may be needed to execute the changes: as in the first example in this article, an object needs to be retrieved before being changed, so the whole data retrieving process above is gone through. If there is user customization, it will be executed; for example, a table's INSERT / UPDATE / DELETE can be customized in the O/R designer.

    It is important to keep this overhead in mind.

    Bulk deleting / updating

    Another thing to be aware of is bulk deleting:

        private static void DeleteProducts(int categoryId)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                database.Products.DeleteAllOnSubmit(
                    database.Products.Where(product => product.CategoryID == categoryId));
                database.SubmitChanges();
            }
        }

    The expected SQL would be something like:

        BEGIN TRANSACTION
        exec sp_executesql N'DELETE FROM [dbo].[Products] AS [t0]
        WHERE [t0].[CategoryID] = @p0',N'@p0 int',@p0=9
        COMMIT TRANSACTION

    However, as mentioned before, the actual SQL retrieves the entities, and then deletes them one by one:

        -- Retrieves the entities to be deleted:
        exec sp_executesql N'SELECT [t0].[ProductID], [t0].[ProductName], [t0].[SupplierID],
        [t0].[CategoryID], [t0].[QuantityPerUnit], [t0].[UnitPrice], [t0].[UnitsInStock],
        [t0].[UnitsOnOrder], [t0].[ReorderLevel], [t0].[Discontinued]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[CategoryID] = @p0',N'@p0 int',@p0=9

        -- Deletes the retrieved entities one by one:
        BEGIN TRANSACTION
        exec sp_executesql N'DELETE FROM [dbo].[Products] WHERE ([ProductID] = @p0) AND
        ([ProductName] = @p1) AND ([SupplierID] IS NULL) AND ([CategoryID] = @p2) AND
        ([QuantityPerUnit] IS NULL) AND ([UnitPrice] = @p3) AND ([UnitsInStock] = @p4) AND
        ([UnitsOnOrder] = @p5) AND ([ReorderLevel] = @p6) AND (NOT ([Discontinued] = 1))',
        N'@p0 int,@p1 nvarchar(4000),@p2 int,@p3 money,@p4 smallint,@p5 smallint,@p6 smallint',
        @p0=78,@p1=N'Optimus Prime',@p2=9,@p3=$0.0000,@p4=0,@p5=0,@p6=0
        exec sp_executesql N'DELETE FROM [dbo].[Products] WHERE ([ProductID] = @p0) AND
        ([ProductName] = @p1) AND ([SupplierID] IS NULL) AND ([CategoryID] = @p2) AND
        ([QuantityPerUnit] IS NULL) AND ([UnitPrice] = @p3) AND ([UnitsInStock] = @p4) AND
        ([UnitsOnOrder] = @p5) AND ([ReorderLevel] = @p6) AND (NOT ([Discontinued] = 1))',
        N'@p0 int,@p1 nvarchar(4000),@p2 int,@p3 money,@p4 smallint,@p5 smallint,@p6 smallint',
        @p0=79,@p1=N'Bumble Bee',@p2=9,@p3=$0.0000,@p4=0,@p5=0,@p6=0
        -- ...
        COMMIT TRANSACTION

    The same applies to bulk updating. This is really not effective, and it needs to be kept in mind. There are already some solutions on the Internet, like this one. The idea is to wrap the above SELECT statement into an INNER JOIN:

        exec sp_executesql N'DELETE [dbo].[Products] FROM [dbo].[Products] AS [j0]
        INNER JOIN (
            SELECT [t0].[ProductID], [t0].[ProductName], [t0].[SupplierID], [t0].[CategoryID],
            [t0].[QuantityPerUnit], [t0].[UnitPrice], [t0].[UnitsInStock], [t0].[UnitsOnOrder],
            [t0].[ReorderLevel], [t0].[Discontinued]
            FROM [dbo].[Products] AS [t0]
            WHERE [t0].[CategoryID] = @p0) AS [j1]
        ON ([j0].[ProductID] = [j1].[ProductID])', -- The primary key
        N'@p0 int',@p0=9

    Query plan overhead

    The last thing is about the SQL Server query plan. Before .NET 4.0, LINQ to SQL had an issue (not sure whether it is a bug): LINQ to SQL internally uses ADO.NET, but it does not set SqlParameter.Size for a variable-length argument, like an argument of NVARCHAR type, etc. So for two queries with the same SQL but different argument lengths:

        using (NorthwindDataContext database = new NorthwindDataContext())
        {
            database.Products.Where(product => product.ProductName == "A")
                .Select(product => product.ProductID).ToArray();

            // The same SQL and argument type, different argument length.
            database.Products.Where(product => product.ProductName == "AA")
                .Select(product => product.ProductID).ToArray();
        }

    pay attention to the argument length in the translated SQL:

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(1)',@p0=N'A'

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(2)',@p0=N'AA'

    Here is the overhead: the first query's cached query plan is not reused by the second one, which can be seen by checking the plan cache:

        SELECT sys.syscacheobjects.cacheobjtype, sys.dm_exec_cached_plans.usecounts, sys.syscacheobjects.[sql]
        FROM sys.syscacheobjects
        INNER JOIN sys.dm_exec_cached_plans
        ON sys.syscacheobjects.bucketid = sys.dm_exec_cached_plans.bucketid;

    They actually use different query plans. Again, pay attention to the argument length in the [sql] column (@p0 nvarchar(2) / @p0 nvarchar(1)). Fortunately, in .NET 4.0 this is fixed:

        internal static class SqlTypeSystem
        {
            private abstract class ProviderBase : TypeSystemProvider
            {
                protected int? GetLargestDeclarableSize(SqlType declaredType)
                {
                    SqlDbType sqlDbType = declaredType.SqlDbType;
                    if (sqlDbType <= SqlDbType.Image)
                    {
                        switch (sqlDbType)
                        {
                            case SqlDbType.Binary:
                            case SqlDbType.Image:
                                return 8000;
                        }
                        return null;
                    }
                    if (sqlDbType == SqlDbType.NVarChar)
                    {
                        return 4000; // Max length for NVARCHAR.
                    }
                    if (sqlDbType != SqlDbType.VarChar)
                    {
                        return null;
                    }
                    return 8000;
                }
            }
        }

    In the above example, the translated SQL becomes:

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(4000)',@p0=N'A'

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(4000)',@p0=N'AA'

    so the two queries reuse the same cached query plan: the [usecounts] column is now 2.

    Read the article

  • Dilemma for growing a project: Open source volunteer developers VS closed source paid / revshare developers? [closed]

    - by giorgio79
    I am trying to grow my project, and I am vacillating between some examples. The options seem to be:

    1. Open-source the project to draw volunteer developers.

    Pros: This would mean anyone can try to make some money off the code, which would motivate them to contribute back and grow the project.

    Cons: Existing, bigger players could easily copy and paste my work so far. (They can also replicate it without having access to the code, but that would take more time.) I also thought of using the AGPL license, but again, code can still be copied without redistribution. After all, enforcing a license costs a lot of money, and I cannot just say to a possible copycat, "it seems you copied my code, show me what you got".

    2. Keep the project closed source, but create some kind of a developer program with revenue sharing.

    Pros: I keep the main rights to the project but still generate interest by creating a developer program. No one can copy the code easily (only with considerable effort), yet contributions are as easy as a breeze. I also see many companies open-sourcing only part of their projects: Acquia does not open source its multisite setup, for example, and GitHub does not open source some of its core business.

    Cons: Less attention from developers committed to open source.

    Conclusion: Option 2 seems the most secure, but I would love some feedback.

    Read the article

  • The Red Gate and .NET Reflector Debacle

    - by Rick Strahl
    About a month ago Red Gate – the company who owns the .NET Reflector tool most .NET devs use at one point or another – decided to change their business model for Reflector and take the product from free to a fully paid-for license model.

    As a bit of history: .NET Reflector was originally created by Lutz Roeder as a free community tool to inspect .NET assemblies. Using Reflector you can examine the types in an assembly, drill into type signatures and quickly disassemble code to see how a particular method works. In case you've been living under a rock and you've never looked at Reflector, here's what it looks like drilled into an assembly from disk, with some disassembled source code showing. Note that you get tons of information about each element in the tree, and almost all related types and members are clickable both in the list and the source view, so it's extremely easy to navigate and follow the code flow even in this static, assembly-only view.

    For many years Lutz kept the tool up to date and gradually added more features, making an already amazing tool better. Then about two and a half years ago Red Gate bought the tool from Lutz. A lot of ruckus and noise ensued in the community back then about what would happen with the tool and… for the most part very little did. Other than the incessant update notices with prominent Red Gate promo on them, life with Reflector went on. The product didn't die and it didn't go commercial or to a charge model. When .NET 4.0 came out it still continued to work, mostly because the .NET feature set doesn't drastically change how types behave.

    Then a month back Red Gate started making noise about a new version, Version 7, which would be commercial. No more free version - and a shit storm broke out in the community.

    Now normally I'm not one to be critical of companies trying to make money off a product, much less a product that's as incredibly useful as Reflector. There isn't a day in .NET development that goes by for me where I don't fire up Reflector. Whether it's for examining the innards of the .NET Framework, checking out third party code, or verifying some of my own code and resources. Even more so recently, I've been doing a lot of interop work with a non-.NET application that needs to access .NET components, and Reflector has been immensely valuable to me (and my clients) in figuring out the exact type signatures required to call .NET components in assemblies. In short, Reflector is an invaluable tool to me.

    Ok, so what's the problem? Why all the fuss? Certainly the $39 Red Gate is trying to charge isn't going to kill any developer. If there's any tool in .NET that's worth $39, it's Reflector, right? Right, but that's not the problem here. The problem is how Red Gate went about moving the product to commercial, which borders on the downright bizarre. It's almost as if somebody in management wrote a slogan: "How can we piss off the .NET community in the most painful way we can?" And at that, it seems, Red Gate has utterly succeeded.

    People are rabid, and for once I think that this outrage isn't exactly misplaced. Take a look at the message thread that Red Gate dedicated to this, linked from the download page. Not only is Version 7 going to be a paid commercial tool, but the older versions of Reflector won't be available any longer. Not only that, but older versions that are already in use will also continually try to update themselves to the new paid version – which, when installed, will then expire unless registered properly.
    There have also been reports of Version 6 installs shutting themselves down and failing to work if the update is refused (I haven't seen that myself, so I'm not sure if it's true). In other words, Red Gate is trying to make damn sure they're getting your money if you attempt to use Reflector.

    There's a lot of temptation there. Think about the millions of .NET developers out there and all of them possibly upgrading – that's a nice chunk of change that Red Gate's sitting on. Even with all the community backlash, these guys are probably making some bank right now, just because people need to get on with life.

    Red Gate also put up a Feedback link on the download page – which, not surprisingly, is chock full of hate mail condemning the move. Oddly, there's not a single response to any of those messages from the Red Gate folks, except where it concerns license questions for the full version. It puzzles me what purpose that link serves, other than as yet another complete example of failure to understand how to handle customer relations.

    There's no doubt that all of this has caused some serious outrage in the community. The sad part though is that this could have been handled so much less arrogantly, without pissing off the entire community and causing so much ill will. People are pissed off, and I have no doubt that this negative publicity will show up in the sales numbers for their other products. I certainly hope so. Stupidity ought to be painful!

    Why do Companies do boneheaded stuff like this?

    Red Gate's original decision to buy Reflector was hotly debated, but at the time most of what would happen was speculation. I thought it was a smart move for any company in need of spreading its marketing message and corporate image as a vendor in the .NET space. Where else do you get to flash your corporate logo to hordes of .NET developers on a regular basis? Exploiting that marketing, with the goodwill of providing a free tool, breeds positive feedback that hopefully has a good effect on the company's visibility and the products it sells.

    Instead, Red Gate seems to have taken exactly the opposite tack of corporate bullying to try to make a quick buck – and in the process ruined any community goodwill that might have come from providing a community service for free while still getting valuable marketing.

    What's so puzzling about this boneheaded escapade is that the company doesn't need to resort to underhanded tactics like what they are trying with Reflector 7. The tools the company makes are very good. I personally use SQL Compare, SQL Data Compare and ANTS Profiler on a regular basis, and all of these tools are essential in my toolbox. They certainly work much better than the tools that are in the box with Visual Studio. Chances are that if Reflector 7 added useful features, I would have been more than happy to shell out my $39 to upgrade when the time was right.

    It's Expensive to give away stuff for Free

    At the same time, this episode shows some of the big problems that come with 'free' tools. A lot of organizations are realizing that giving stuff away for free is actually quite expensive, and the payback is often very intangible, if there is any at all. Those that rely on donations or other voluntary compensation find that the amount contributed is so minuscule as to not matter at all.
    Yet at the same time, I bet most of those clamoring the loudest on that Red Gate Reflector feedback page about Reflector no longer being free have probably NEVER made a donation to any open source project or free tool, ever. The expectation of Free these days is just too great – which is a shame, I think. There's a lot to be said for paid software and having somebody to hold responsible because you gave them some money. There's an incentive -> payback -> responsibility model that seems to be missing from free software (not all of it, but a lot of it). While there certainly are plenty of bad apples in paid software as well, money tends to be a good motivator for people to continue working on and improving products.

    Reasons for giving away stuff are many, but often it's a naïve desire to share things when things are simple. At first it might be no problem to volunteer time and effort, but as products mature the fun goes out of it, and as the reality of product maintenance kicks in, developers want to get something back for the time and effort they're putting into non-glamorous work. That's when products die or languish, and this is painful for all to watch.

    For Red Gate, however, I think there was always a pretty good payback from the Reflector acquisition in terms of marketing: visibility and possible positioning of their products, although they seem to have mostly ignored that option. On the other hand, they started this off pretty badly even two and a half years back when they acquired Reflector from Lutz, with the same arrogant attitude that is evident in the latest episode. You really gotta wonder what folks in management are thinking – the sad part is, from advance emails that were circulating, they were fully aware of the shit storm they were inciting with this, and I suspect they are banking on the sheer numbers of .NET developers to still make them a tidy chunk of change from upgrades…

    Alternatives are coming

    For me personally the single license isn't a problem, but I actually have a tool that I sell (an interop Web Service proxy generation tool) to customers, and one of the things I recommend using with it has been Reflector, to view assembly information and to find which interop classes to instantiate from the non-.NET environment. It's been nice to use Reflector for this, with its small footprint and zero-configuration installation. But now, with V7 becoming a paid tool, that option is not going to be available anymore.

    Luckily, it looks like the .NET community is jumping in and trying to fill the void. Amidst the Red Gate outrage, a new library called ILSpy has sprung up, providing at least some of the core functionality of Reflector as open source. It looks promising going forward, and I suspect there will be a lot more support and interest in this project now that Reflector has gone over to the 'dark side'…

    © Rick Strahl, West Wind Technologies, 2005-2011

    Read the article

  • Data Quality Through Data Governance

    Data Quality Governance

    Data quality is very important to every organization; bad data costs an organization time, money, and resources that could be saved if the proper governance were put in place.

    Data Governance Program Criteria:

    - Support from executive management and all business units
    - Data Stewardship Program
    - Cross-functional team of data stewards
    - Data Governance Committee
    - Quality structured data

    It should go without saying, but any successful project in today's business world must get buy-in from executive management and all stakeholders involved with the project. If management does not fully support a project, because they do not see that it is in their and the company's best interest, then they will remove or eliminate the funding, resources, and time allocated to work on it. In essence, they can render a project dead until it is officially killed by the business. In addition, buy-in from stakeholders is also very important, because stakeholders who do not support a project can cause delays and increased spending of time, money, and resources.

    Data stewardship programs are administered by a data steward manager whose primary focus is to support, train, and manage a cross-functional team of data stewards. A cross-functional team of data stewards, pulled from various departments, acts to ensure that all systems work toward achieving the organization's goals. Typically, data stewards are subject matter experts who act as mediators between their respective departments and IT.

    Data Quality Procedures

    Data Governance Committees are composed of data stewards, upper management, IT leadership, and various subject matter experts, depending on the company. The primary goal of this committee is to define strategic goals, coordinate activities, set data standards, and offer data guidelines for the business.

    Data Quality Policies

    In 1997, Claudia Imhoff defined a data steward's responsibility as to approve business naming standards, develop consistent data definitions, determine data aliases, develop standard calculations and derivations, document the business rules of the corporation, monitor the quality of the data in the data warehouse, define security requirements, and so forth. She further explains that data stewards are responsible for creating and enforcing policies on the following issues, among others:

    - Resolving data integration issues
    - Determining data security
    - Documenting data definitions, calculations, summarizations, etc.
    - Maintaining/updating business rules
    - Analyzing and improving data quality

    Read the article

  • Cost to licence characters or ships for a game

    - by Michael Jasper
    I am producing a game pitch document for a university game design class, and I am looking for examples of licensing costs for using characters or ships from other IP holders in a game. For example:

    - cost of using an X-Wing in a game, licensed from Lucas
    - cost of using the Enterprise in a game, licensed from Paramount
    - cost of using the Space Shuttle (if any), licensed from NASA

    EDIT: The closest information I can find is from an article about Knights of the Old Republic, but it isn't nearly specific enough for my needs:

      What Kotick means by Lucas being the principal beneficiary of the success of The Old Republic is that there are most likely clauses in the license agreement that give percentages, points, or another denomination of revenue out to Lucas and his people just for the Star Wars name, and that amount is presumed to be a great deal of money. Kotick is saying that because the cost of the license is so prohibitive, as he has personally had experience with in his position as CEO of Activision Blizzard, EA will not be able to be profitable because of the hemorrhaging of money to the licensor.

    EDIT 2: Another vague source states that FOX uses a "five-figure rule" (presumably between $10,000 and $99,999):

      It seems FOX, like most studios, will not license individuals to create new works based upon their products. They will only commission individuals of their choosing if they elect to branch out into expanded product lines related to those licenses. Alternately, they are open to making the licensing available to large corporations with access to global markets, but only if those corporations agree to what Ms Friedman called a "five-figure guarantee". Presumably this means that the corporation seeking the licensing must agree to pay a five-figure sum for that license, and be confident that their product will sell enough volume to recoup that fee and produce sufficient profits to make the acquisition worth their while.

    Thank you!

    Read the article
