Search Results

Search found 12506 results on 501 pages for 'business transaction mana'.

Page 57 of 501

  • SEO Optimization - An Effective Tool For Business

    SEO, or Search Engine Optimization, is an important part of any internet marketing operation. If you are thinking of improving and promoting your website, then it is one of the best options you can choose. It has also become an effective tool that companies use to promote their websites.

    Read the article

  • Office and SharePoint 2010 Released to Business Customers

    Microsoft today released its Office 2010 and SharePoint 2010 products to enterprise customers as part of a global launch event.

    Read the article

  • Is Your Business Found on Google?

    A lot of people think that if you have a website, you will be on Google. This is in fact not true. It takes effort to get listed on Google and a list of things done right will get you into the search engine's favor.

    Read the article

  • Is SEO Critical to Your Business? Consider This

    The number of people turning to the Internet to get answers to their questions is growing by leaps and bounds. Currently, around 64% of people use search engines like Google to find information, answers, shopping solutions, and more. As that number grows, so does the importance of properly implementing Search Engine Optimization (SEO) practices in your marketing.

    Read the article

  • How SEO Will Change Your Business

    The best way leading businesses have managed to dominate the top search engine placements has been through online marketing, also known as search engine optimization. SEO is one of the foremost breakthroughs in online marketing, where SEO companies help websites and companies rank on the top pages of search engine results.

    Read the article

  • SEO Tools to Help You in Your Business

    SEO and other online strategies are being used more and more by businesses today. As the market develops and becomes more educated, it is increasingly important to stay on top of your game and understand the impact that SEO and other online tactics have on your website. This article discusses some of the things you need to watch out for in your online activities.

    Read the article

  • Overly accessible and incredibly resource hungry relationships between business objects. How can I f…

    - by Mike
    Hi. Firstly, this might seem like a long question; I don't think it is. The code is just an overview of what I'm currently doing. It doesn't feel right, so I am looking for constructive criticism, warnings about pitfalls, and suggestions for what I can do.

    I have a database with business objects. I need to access properties of parent objects, and I need to maintain some sort of state through the business objects. If you look at the classes, I don't think the access modifiers are right, and I don't think it's structured very well. Most of the relationships are modelled with public properties. In SubAccount.Account.User.ID, all of those are public. Is there a better way to model a relationship between classes so it's not so "public"?

    The other part of this question is about resources: if I were to make a User.GetUserList() function that returns a List, and I had 9000 users, then when I call the GetUsers method it will make 9000 User objects, and inside those it will make 9000 new AccountCollection objects. What can I do to make this project less resource hungry?

    Please find the code below and rip it to shreds.

        public class User
        {
            public string ID { get; set; }
            public string FirstName { get; set; }
            public string LastName { get; set; }
            public string PhoneNo { get; set; }
            public AccountCollection accounts { get; set; }

            public User { accounts = new AccountCollection(this); }

            public static List<Users> GetUsers() { return Data.GetUsers(); }
        }

        public AccountCollection : IEnumerable<Account>
        {
            private User user;

            public AccountCollection(User user) { this.user = user; }

            public IEnumerable<Account> GetEnumerator() { return Data.GetAccounts(user); }
        }

        public class Account
        {
            public User User { get; set; } // This is public so that the subaccount can access its Account's User's ID
            public int ID;
            public string Name;

            public Account(User user) { this.user = user; }
        }

        public SubAccountCollection : IEnumerable<SubAccount>
        {
            public Account account { get; set; }

            public SubAccountCollection(Account account) { this.account = account; }

            public IEnumerable<SubAccount> GetEnumerator() { return Data.GetSubAccounts(account); }
        }

        public class SubAccount
        {
            public Account account { get; set; } // this is public so that my Data class can access the account, to get the account's user's ID.

            public SubAccount(Account account) { this.account = account; }

            public Report GenerateReport() { Data.GetReport(this); }
        }

        public static class Data
        {
            public static List<Account> GetSubAccounts(Account account)
            {
                using (var dc = new databaseDataContext())
                {
                    List<SubAccount> query = (from a in dc.Accounts
                                              where a.UserID == account.User.ID // this is getting the account's user's ID
                                              select new SubAccount(account) { ID = a.ID, Name = a.Name, }).ToList();
                }
            }

            public static List<Account> GetAccounts(User user)
            {
                using (var dc = new databaseDataContext())
                {
                    List<Account> query = (from a in dc.Accounts
                                           where a.UserID == User.ID // this is getting the user's ID
                                           select new Account(user) { ID = a.ID, Name = a.Name, }).ToList();
                }
            }

            public static Report GetReport(SubAccount subAccount)
            {
                Report report = new Report();
                // database access code here
                // need to get the user id of the subaccount's account for data querying.
                // i've got the subaccount, but how should i get the user id.
                // i would imagine something like this:
                int accountID = subAccount.Account.User.ID;
                // but this would require the subaccount's Account property to be public.
                // i do not want this to be accessible from my other project (UI).
                // reading up on internal seems to do the trick, but within my code it still feels
                // public. I could restrict the property to read, and only private set.
                return report;
            }

            public static List<User> GetUsers()
            {
                using (var dc = new databaseDataContext())
                {
                    var query = (from u in dc.Users
                                 select new User { ID = u.ID, FirstName = u.FirstName, LastName = u.LastName, PhoneNo = u.PhoneNo }).ToList();
                    return query;
                }
            }
        }
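
    One direction the criticism often takes, shown here only as a sketch with adapted, hypothetical names (not the original design): keep the parent references readable everywhere but settable only inside the business/data assembly via internal members, and defer the per-user account loading with Lazy<T> so that materialising 9000 User objects does not also build 9000 account collections up front.

        using System;
        using System.Collections.Generic;

        public class User
        {
            private readonly Lazy<IReadOnlyList<Account>> _accounts;

            public string ID { get; internal set; }
            public string FirstName { get; internal set; }

            // Accounts are fetched the first time they are asked for, then cached.
            public IReadOnlyList<Account> Accounts => _accounts.Value;

            public User(Func<User, IReadOnlyList<Account>> accountLoader)
            {
                _accounts = new Lazy<IReadOnlyList<Account>>(() => accountLoader(this));
            }
        }

        public class Account
        {
            public User User { get; }               // readable from anywhere, settable only here
            public int ID { get; internal set; }
            public string Name { get; internal set; }

            internal Account(User user)             // only this assembly (the business/data layer) can construct one
            {
                User = user;
            }
        }

    Because the UI lives in a separate project, internal setters and constructors keep chains like SubAccount.Account.User.ID readable there without letting that project rewire the object graph.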

    Read the article

  • Pros and Cons on where to place business logic: app level or DB

    - by Juri
    Hi, I keep encountering discussions about where to place the business logic: inside a business layer in the application code, or down in the DB in the form of stored procedures. Personally I'd tend towards the first approach, but I'd like to hear some opinions from you first, without influencing you with my personal views. I know there is no one-size-fits-all solution and it often depends on many factors, but we can discuss that. By the way, we are in the context of web applications, and our current approach is to have:

    - a UI layer which accepts UI input and does a first, client-side validation;
    - a business layer with a number of service classes which contain the business logic, including (server-side) validation of user input;
    - a data access layer which calls stored procedures in the DB for persistence/read operations.

    Many people however tend to move the business-layer work (especially the validation) down to the DB in the form of stored procedures. What do you think about it? I'd like to discuss.
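
    For what it's worth, a minimal sketch of the first approach as described above (hypothetical names throughout): the service class owns the server-side validation, and the data access layer is reduced to calling the stored procedure, so each rule lives in exactly one place.

        using System;

        public class Order { public int Quantity { get; set; } }

        public interface IOrderRepository
        {
            void Save(Order order);   // the DAL implementation wraps a stored-procedure call
        }

        public class OrderService
        {
            private readonly IOrderRepository _repository;

            public OrderService(IOrderRepository repository) { _repository = repository; }

            public void PlaceOrder(Order order)
            {
                // server-side business validation lives here, not in the stored procedure
                if (order == null) throw new ArgumentNullException(nameof(order));
                if (order.Quantity <= 0) throw new ArgumentException("Quantity must be positive.", nameof(order));

                _repository.Save(order);
            }
        }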

    Read the article

  • What are good ways to guarantee business continuity with a SaaS product?

    - by tommyvdz
    For my Bachelor Thesis I am researching how SaaS providers can arrange some sort of business continuity guarantee. You probably know the Source Code Escrow arrangements for 'shrink-wrapped' software: they give customers access to the source code and all applicable documentation whenever the software supplier gets into (financial) trouble. This clearly does not work for SaaS, because customers have no use for just the source code, and they probably cannot afford to be unable to log in to their CRM system for a couple of weeks because the SaaS provider went bankrupt. I am currently researching different methods to solve this problem. Do you know of good, practical solutions to this continuity problem? Or companies that already offer a good solution? Thanks!

    Read the article

  • Concurrency issues with a scheduling app

    - by Sazug
    Our application needs a simple scheduling mechanism - we can schedule only one visit per room for the same time interval (but one visit can be using one or more rooms). Using SQL Server 2005, sample procedure could look like this:

        CREATE PROCEDURE CreateVisit
            @start datetime,
            @end datetime,
            @roomID int
        AS
        BEGIN
            DECLARE @isFreeRoom INT

            BEGIN TRANSACTION

            SELECT @isFreeRoom = COUNT(*)
            FROM visits V
            INNER JOIN visits_rooms VR on VR.VisitID = V.ID
            WHERE @start = start AND @end = [end] AND VR.RoomID = @roomID

            IF (@isFreeRoom = 0)
            BEGIN
                INSERT INTO visits (start, [end]) VALUES (@start, @end)
                INSERT INTO visits_rooms (visitID, roomID) VALUES (SCOPE_IDENTITY(), @roomID)
            END

            COMMIT TRANSACTION
        END

    In order to not have the same room scheduled for two visits at the same time, how should we handle this problem in the procedure? Should we use SERIALIZABLE transaction isolation level or maybe use table hints (locks)? Which one is better?
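
    Not a verdict on which option is better, just a minimal sketch of what the SERIALIZABLE route can look like when driven from the calling side (connection string and parameter values are placeholders; the procedure's own BEGIN/COMMIT simply nests inside the ambient transaction). Under SERIALIZABLE, the locks taken by the COUNT(*) existence check are held until the scope completes, so a competing caller for the same room and interval has to wait.

        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Transactions;

        class SchedulingExample
        {
            static void CreateVisit(string connString, DateTime start, DateTime end, int roomId)
            {
                var options = new TransactionOptions
                {
                    IsolationLevel = System.Transactions.IsolationLevel.Serializable,
                    Timeout = TimeSpan.FromSeconds(30)
                };

                using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
                using (var conn = new SqlConnection(connString))
                using (var cmd = new SqlCommand("CreateVisit", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@start", start);
                    cmd.Parameters.AddWithValue("@end", end);
                    cmd.Parameters.AddWithValue("@roomID", roomId);

                    conn.Open();             // the connection enlists in the ambient SERIALIZABLE transaction
                    cmd.ExecuteNonQuery();   // the proc's BEGIN TRANSACTION / COMMIT nests inside it
                    scope.Complete();        // locks from the existence check are released after this
                }
            }
        }

    The table-hint alternative from the question (for example WITH (UPDLOCK, HOLDLOCK) on the SELECT inside the procedure) keeps the locking decision inside the procedure and locks a narrower range; either way, test under concurrent load for blocking and deadlocks.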

    Read the article

  • What is the business case for a dependency injection (DI) framework?

    - by kalkie
    At my company we want to start using a dependency injection (DI) framework for managing our dependencies. I have some difficulty explaining the business value of such a framework. Currently I have come up with these reasons:

    - Less source code: delete all the builder patterns in the code.
    - Increased flexibility: easier to switch dependencies.
    - Better separation of concerns: the framework is responsible for creating instances instead of our code.

    Has anybody else had to persuade management? How did you do that? What reasons did you use?
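
    When making the case to management, a small before/after often lands better than the bullet points alone. A sketch, with hypothetical InvoiceService/IInvoiceRepository names and Microsoft.Extensions.DependencyInjection used purely as a stand-in for whichever container is chosen:

        using Microsoft.Extensions.DependencyInjection;

        public interface IInvoiceRepository { void Save(string invoice); }

        public class SqlInvoiceRepository : IInvoiceRepository
        {
            public void Save(string invoice) { /* database code */ }
        }

        public class InvoiceService
        {
            private readonly IInvoiceRepository _repository;

            // the dependency is stated explicitly; no builder or factory code needed here
            public InvoiceService(IInvoiceRepository repository) => _repository = repository;

            public void Submit(string invoice) => _repository.Save(invoice);
        }

        public static class CompositionRoot
        {
            public static InvoiceService Build()
            {
                var services = new ServiceCollection();
                services.AddTransient<IInvoiceRepository, SqlInvoiceRepository>(); // swap implementations in one line
                services.AddTransient<InvoiceService>();
                return services.BuildServiceProvider().GetRequiredService<InvoiceService>();
            }
        }

    The business pitch is in the registration lines: swapping SqlInvoiceRepository for a fake in tests, or for a different store in production, is a one-line change instead of edits to every piece of builder code.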

    Read the article

  • Can I get a reference to a pending transaction from a SqlConnection object?

    - by Rune
    Hey, suppose someone (other than me) writes the following code and compiles it into an assembly:

        using (SqlConnection conn = new SqlConnection(connString))
        {
            conn.Open();
            using (var transaction = conn.BeginTransaction())
            {
                /* Update something in the database */
                /* Then call any registered OnUpdate handlers */
                InvokeOnUpdate(conn);
                transaction.Commit();
            }
        }

    The call to InvokeOnUpdate(IDbConnection conn) calls out to an event handler that I can implement and register. Thus, in this handler I will have a reference to the IDbConnection object, but I won't have a reference to the pending transaction. Is there any way in which I can get a hold of the transaction? In my OnUpdate handler I want to execute something similar to the following:

        private void MyOnUpdateHandler(IDbConnection conn)
        {
            var cmd = conn.CreateCommand();
            cmd.CommandText = someSQLString;
            cmd.CommandType = CommandType.Text;
            cmd.ExecuteNonQuery();
        }

    However, the call to cmd.ExecuteNonQuery() throws an InvalidOperationException complaining that "ExecuteNonQuery requires the command to have a transaction when the connection assigned to the command is in a pending local transaction. The Transaction property of the command has not been initialized". Can I in any way enlist my SqlCommand cmd with the pending transaction? Can I retrieve a reference to the pending transaction from the IDbConnection object (I'd be happy to use reflection if necessary)?
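
    There is no public property on SqlConnection that exposes the pending local transaction, so anything here is a workaround. A sketch that relies on reflection over internal SqlClient members: the member names InnerConnection, CurrentTransaction and Parent are an assumption about the .NET Framework implementation, undocumented and version-dependent.

        using System.Data.SqlClient;
        using System.Reflection;

        static class PendingTransactionFinder
        {
            // Best-effort sketch: digs the current SqlTransaction out of a SqlConnection via
            // reflection. The member names below are internal framework details, not a supported API.
            public static SqlTransaction TryGetPendingTransaction(SqlConnection conn)
            {
                const BindingFlags flags = BindingFlags.NonPublic | BindingFlags.Instance;

                object innerConnection = conn.GetType()
                    .GetProperty("InnerConnection", flags)?.GetValue(conn, null);
                object internalTransaction = innerConnection?.GetType()
                    .GetProperty("CurrentTransaction", flags)?.GetValue(innerConnection, null);
                return (SqlTransaction)internalTransaction?.GetType()
                    .GetProperty("Parent", flags)?.GetValue(internalTransaction, null);
            }
        }

    With that in hand, the handler can cast its IDbConnection back to SqlConnection, call the helper, and set cmd.Transaction before ExecuteNonQuery(). If the reflection returns null (a different provider or framework version), the handler should fail gracefully rather than execute without a transaction.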

    Read the article

  • Difference between SET autocommit=1 and START TRANSACTION in mysql (Have I missed something?)

    - by tkolar
    Hey there, I am reading up on transactions in MySQL and want to be sure I have grasped the statement semantics correctly, so here goes. I know what a transaction is supposed to do; I'm just not sure whether I understood the statement semantics or not. So, my question is: is anything wrong (and, if so, what is wrong) with the following?

    - By default, autocommit mode is enabled in MySQL.
    - SET autocommit=0; will begin a transaction; SET autocommit=1; will implicitly commit.
    - It is possible to COMMIT; as well as ROLLBACK;, in both of which cases autocommit is still set to 0 afterwards (and a new transaction is implicitly started).
    - START TRANSACTION; will basically SET autocommit=0; until a COMMIT; or ROLLBACK; takes place. In other words, START TRANSACTION; and SET autocommit=0; are equivalent, except that START TRANSACTION; does the equivalent of implicitly adding a SET autocommit=0; after COMMIT; or ROLLBACK;.

    If that is the case, I don't understand http://dev.mysql.com/doc/refman/5.5/en/set-transaction.html#isolevel_serializable - seeing as having an isolation level implies that there is a transaction, meaning that autocommit should be off anyway? And if there is another difference (other than the one described above) between beginning a transaction and setting autocommit, what is it? Thanks a lot in advance for your help!

    Read the article

  • How can I force a merge of all WAL files in pg_xlog back into my base "data" directory?

    - by Zac B
    Question: Is there a way to tell Postgres (9.2) to "merge all WAL files in pg_xlog back into the non-WAL data files, and then delete all WAL files successfully merged"? I would like to be able to "force" this operation; i.e. checkpoint_segments or archiving settings should be ignored. The filesystem WAL buffer (pg_xlog) directory should be emptied, or nearly emptied.

    It's fine if some or all of the space consumed by the pg_xlog directory is then consumed by the data directory; our DBA has asked for WAL database backups without any backlogged WALs, but space consumption is not a concern. Having near-zero WAL activity during this operation is a fine constraint. I can ensure that the database server is either shut down or not connectible (zero user-generated transaction load) during this process. Essentially, I'd like Postgres to ignore archiving/checkpoint retention policies temporarily, and flush all WAL activity to the core database files, leaving pg_xlog in the same state as if the database were recently created, with very few WAL files.

    What I've tried: I know that the pg_basebackup utility performs something like this (it generates an almost-all-WALs-merged copy of a Postgres instance's data directory), but we aren't ready to use it on all our systems yet, as we are still testing replication settings; I'm hoping for a more short-term solution. I've tried issuing CHECKPOINT commands, but they just recycle one WAL file and replace it with another (that is, if they do anything at all; if I issue them during database idle time, they do nothing). pg_switch_xlog() similarly just forces a switch to the next log segment; it doesn't flush all queued/buffered segments.

    I've also played with the pg_resetxlog utility. That utility sort of does what I want, but all of its usage docs seem to indicate that it destroys (rather than flushing out of the transaction log and into the main data files) some or all of the WAL data. Is that impression accurate? If not, can I use pg_resetxlog during a zero-WAL-activity period to force a flush of all queued WAL data to non-WAL data? If the answer to that is negative, how can I achieve this goal? Thanks!

    Read the article

  • Default Transaction Timeout

    - by MattH
    I used to set transaction timeouts by using TransactionOptions.Timeout, but have decided for ease of maintenance to use the config approach:

        <system.transactions>
          <defaultSettings timeout="00:01:00" />
        </system.transactions>

    Of course, after putting this in, I wanted to test it was working, so I reduced the timeout to 5 seconds, then ran a test that lasted longer than this - but the transaction does not appear to abort! If I adjust the test to set TransactionOptions.Timeout to 5 seconds, the test works as expected.

    After investigating, I think the problem relates to TransactionOptions.Timeout, even though I'm no longer using it. I still need to use TransactionOptions so I can set IsolationLevel, but I no longer set the Timeout value. If I look at this object after I create it, the timeout value is 00:00:00, which equates to infinity. Does this mean my value set in the config file is being ignored?

    To summarise:

    - Is it impossible to mix the config setting and use of TransactionOptions?
    - If not, is there any way to extract the config setting at runtime and use this to set the Timeout property?
    - [Edit] OR: set the default isolation level without using TransactionOptions?
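
    One possible middle ground, sketched under the assumption that this is .NET Framework System.Transactions, where TransactionManager.DefaultTimeout reflects the configured defaultSettings timeout: keep using TransactionOptions for IsolationLevel, but copy the configured default back into Timeout instead of leaving it at 00:00:00. The ReadCommitted level below is only an example value.

        using System;
        using System.Transactions;

        static class TransactionFactory
        {
            public static TransactionScope CreateScope()
            {
                var options = new TransactionOptions
                {
                    IsolationLevel = IsolationLevel.ReadCommitted,   // the reason TransactionOptions is needed
                    Timeout = TransactionManager.DefaultTimeout      // reuse the config value instead of 00:00:00
                };
                return new TransactionScope(TransactionScopeOption.Required, options);
            }
        }

    This keeps the timeout maintained in config while still letting code choose the isolation level; whether an all-default TransactionOptions honours the config value is exactly the ambiguity the question runs into, and being explicit sidesteps it.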

    Read the article
