Search Results

Search found 1639 results on 66 pages for 'csv'.

  • Drupal 7: One-time user account

    - by Noob
    I'm going to create a survey in Drupal 7 with the Webform module, installed on a Debian system which may be adapted in every way. The users (personally known, approx. 120) doing the survey will walk into a room and complete the survey in browsers on different computers. After that, they'll leave the room and other people will enter, complete the survey on the same computers, and so on. Each user may enter only one submission. The process needs to be anonymous, i.e. I mustn't have any idea of who made which submission. My current solution is to generate random one-time passwords and hand out one password per user (without noting who got which password). Within the survey there will be a password field where the one-time password is entered. The value is checked by Webform to be unique. I'll get the data via CSV or Excel and verify the passwords manually in Excel by comparing them to the list of valid passwords. The problem is: I don't like the idea of manually generating the password list, copying it to Excel and doing a manual check. That's fine for one-time use, but we're going to repeat the survey every once in a while. I'd rather generate one-time logins (like user0001/fdlkjewf, user0002/dfrefnnr, ...) for each survey, hand them out to the users and let Drupal/Debian/whatever check whether a submission is valid or not. Do you have any idea how to batch-generate about 120 users with one-time passwords in Drupal 7 and verify that each user may submit the form only once? Or do you have a better idea how to accomplish the task within the intranet? Thank you for your help.
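
    Generating the credential list itself is the easy half; below is a minimal sketch in plain Python, assuming the username pattern from the question and that the resulting CSV is then imported into Drupal by whatever means you settle on (e.g. a user-import module or a drush script, both assumptions rather than part of the question):

        import csv
        import secrets
        import string

        ALPHABET = string.ascii_lowercase + string.digits

        def make_credentials(n, pw_len=8, path="one_time_logins.csv"):
            # one row per participant: user0001,fdlkjewf-style pairs
            with open(path, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["username", "password"])
                for i in range(1, n + 1):
                    password = "".join(secrets.choice(ALPHABET) for _ in range(pw_len))
                    writer.writerow(["user%04d" % i, password])

        make_credentials(120)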

  • What to look for in a switch with LAN/WAN versus an iSCSI SAN?

    - by Luke
    I'm setting up a VMware ESXi 5 environment with 3 server nodes. Dell recommended 2x Force10 S60 switches shared (iSCSI SAN, LAN/WAN). The S60 switches are extremely powerful. They have 1.25 GB of buffer cache, < 9us latency. But they are very expensive (online price ~$15k per switch, actual quote a little less). I've been told that "by the book" you should have at least 2 internal switches for the SAN and 2 switches for LAN/WAN (i.e., each role with a redundant partner). I know some of the pros and cons of each approach. What I'm wondering is, would it be more cost-effective to disjoin the SAN from the LAN with less expensive switches? The answer to this question highlights what I should be looking for in a switch for the SAN. What should I be looking for in a LAN/WAN switch, in comparison to the SAN? With the above linked question for the SAN: How is buffer latency measured? When you see 36 MB of buffer cache, is that shared or per port? So would that be 768 KB per port (36 MB shared across 48 ports), or 36 MB per port? With 3 to 6 servers, how much buffer cache do you really need? What else should I be looking at? Our application will be heavily using HTML5 websockets (a high number of persistent connections). The amount of data being sent is small; data sent between client <-> server isn't broadcast (not a chat/IM service). We will be doing some database reporting too (CSV export, sums, some joins). We are a small business and on a budget. We'd probably be able to spend no more than $20k on switches total (2 or 4).
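
    For what it's worth, the per-port arithmetic implied above is easy to check; a quick sketch (the 48-port count is an assumption inferred from the 36 MB to 768 KB division in the question):

        # buffer cache per port if 36 MB is shared across a 48-port switch
        shared_buffer_kb = 36 * 1024      # 36 MB expressed in KB
        ports = 48                        # assumed port count
        print(shared_buffer_kb / ports)   # -> 768.0 KB per port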

  • How to view / enumerate / obtain a list of all effective rights / permissions on an Active Directory object?

    - by Laura
    I am new to Server Fault and was hoping to find an answer to a question that I have been struggling with for the past week or so. I have recently been asked by my management to furnish a list of all the effective rights / permissions delegated on the Active Directory object for our Domain Admins group. I initially figured I'd use the Effective Permissions tab in Active Directory Users and Computers, but had two problems with it. The first was that it doesn't seem very accurate, and the second was that it requires me to enter the name of a specific user, and it only shows me what it figures are effective permissions for that user. Now, we have more than 1,000 users in our environment, so there's no way I can possibly enter 1,000 user names one by one. Plus, there is no way to export that information either. I also looked at dsacls from MS, but it doesn't do effective permissions. Someone pointed me to a tool called ADUCAdmin, but that seems to falsely claim to do effective permissions. Could someone kindly help me find a way to obtain this listing? Basically, I need to generate a list of all the modify effective permissions granted on the Domain Admins group object, along with the list of all the admins to which these permissions are granted. In case it helps, I don't need a fancy listing - simple text / CSV output would be enough. I would be grateful for any assistance since this is time and security sensitive for us.

  • Transferring Files to server IP and port

    - by Mason
    I need to transfer files from my local computer on Windows 7 to a server running Linux. I access the server with PuTTY through SSH at a specific IPv4 address and port number. I've attempted using the pscp command from my local computer but was denied access by the server: c:\>pscp test.csv userid@**IPv4_Addres***:Port# /path/destination_file_name gives "Fatal: Network error: Connection refused". Either the server blocks all pscp attempts from unauthorized users (most likely my laptop included) or I used the command incorrectly. If you have experience using this command, where exactly will the file get transferred to? I'm assuming that the destination path starts at my home directory on the server. Also, if you have any other alternative methods of transferring the files, let me know. Update 1: I have also tried using WinSCP, however I got permission denied for that as well; it looks like the server will not let me upload or save files. Solved: I had a complete lapse of memory and forgot about sudo (spent too much time with scripts the last 2 months), so I was able to change the permissions to allow external editing. Thanks for all the help guys!
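
    Since the question explicitly asks for alternatives: a minimal SFTP sketch using the third-party paramiko library. The host, port, and credentials are placeholders, and this assumes the server permits SFTP for the account (which, given the WinSCP failure above, is worth verifying first):

        import paramiko

        # placeholders - use the same host/port/credentials that work in PuTTY
        host, port = "192.0.2.10", 22
        username, password = "userid", "secret"

        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, port=port, username=username, password=password)

        sftp = client.open_sftp()
        # a relative remote path resolves against the login user's home directory
        sftp.put("test.csv", "test.csv")
        sftp.close()
        client.close()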

  • Fill down in Excel, but based on multiple values

    - by Jenn D.
    I have spreadsheets (not created by me) that have blank entries in one column where they should really have data. I want to take every empty cell and fill it with the nearest value above it. I'm looking for as little manual intervention as possible, because I'll have to do it repeatedly. I thought some previous version of Excel, or maybe another spreadsheet from the distant past, would do this by default -- that is, if you selected the column with foo and bar, and chose the equivalent of "fill down", you would get what's in the WANT column. What I actually get in Excel is the GET column.

        HAVE:     WANT:     GET:
        foo 1     foo 1     foo 1
            2     foo 2     foo 2
        bar 1     bar 1     foo 1
            2     bar 2     foo 2
            3     bar 3     foo 3

    I'm worried that this might need a macro to be done properly. I used to be a whiz with Excel macros, and then suddenly they were all in VB. My fallback position will be to dump the whole thing to CSV and write a Python script, but if there's any way to do it in Excel that would be much preferable. Even if it involves a couple of different manual steps, that's fine; just not one step per group of lines. That is, a process of "copy the column, do X to it, cut and paste it back" would work, but "do X for each occurrence of foo or bar" won't. The files are too big for that. Any thoughts are appreciated!
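
    The CSV fallback mentioned above only takes a few lines; a sketch with the standard csv module, assuming the column to fill is the first one (index 0) and hypothetical file names:

        import csv

        def fill_down(src, dst, col=0):
            # copy src to dst, replacing empty cells in `col` with the nearest value above
            last = ""
            with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
                reader, writer = csv.reader(fin), csv.writer(fout)
                for row in reader:
                    if row and row[col].strip():
                        last = row[col]
                    elif row:
                        row[col] = last
                    writer.writerow(row)

        fill_down("have.csv", "want.csv")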

  • email attachments [closed]

    - by Alan Doolan
    My company currently uses software on a local machine that will take an email from the email server, extract the attachment, rename it and then add it to a folder on a webserver using FTP. This works well, but they are currently asking if it can be done 'in the cloud' or, what they really mean, not locally. Is there anything that would do this on the server itself? I should clarify a bit. The attachments are various reports that are being sent to different email addresses (mostly Google corporate and free accounts). We need the reports to be in a folder on a webserver so that internal pages can take the information in the reports (CSV) and use it on the webpages or add it to a separate database. The key part being that the files need to be in the particular folders. Though it does work to have a computer running software that takes the files, renames them to the required name and uploads them to the folder, it relies too heavily on one computer working all the time. This is not something we can depend on at this point. I'll be honest, I'm a web developer and not strong with server systems past my particular standard requirements, so this is beyond me. Though yes, I am aware that my boss is not 100% sure what 'cloud' means but likes the word.
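
    A rough sketch of the same pipeline as a server-side cron job, using only the Python standard library. Every host, credential, and folder here is a placeholder, and the renaming rule is left as a stub; the point is just that fetch-extract-upload fits in one short script that can run on the webserver itself:

        import email
        import imaplib
        from ftplib import FTP
        from io import BytesIO

        IMAP_HOST, FTP_HOST = "imap.example.com", "ftp.example.com"  # placeholders

        imap = imaplib.IMAP4_SSL(IMAP_HOST)
        imap.login("reports@example.com", "secret")
        imap.select("INBOX")

        ftp = FTP(FTP_HOST)
        ftp.login("ftpuser", "secret")
        ftp.cwd("/reports")  # the folder the internal pages read from

        _, data = imap.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            for part in msg.walk():
                name = part.get_filename()
                if name and name.lower().endswith(".csv"):
                    # the real renaming rule would go here; the sketch keeps the name
                    ftp.storbinary("STOR " + name, BytesIO(part.get_payload(decode=True)))

        ftp.quit()
        imap.logout()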

  • Extracting information from Active Directory

    - by Nop at NaDa
    I work in the IT support department of a branch of a huge company. I have to take care of a database with all the users, computers, etc. I'm trying to find a way to automatically update the database as much as possible, but the IT infrastructure guys don't give me enough privileges to use Active Directory in order to dump the users, nor do they have the time to give me the information that I need. Some days ago I found Active Directory Explorer from Sysinternals, which allows me to browse through Active Directory, and I found all the information that I need there (username, real name, date when it was created, privileges, company, etc.). Unfortunately, I'm unable to export the data to a human-readable format. I'm only able to take a snapshot of the whole database in a machine-readable format. Taking the snapshot takes hours and I'm afraid that the infrastructure guys won't like me doing entire snapshots on a regular basis. Do you know of any tool (command-line is preferable) that would allow me to retrieve the values of the keys or export them to XML, CSV, etc.?
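
    If plain LDAP read access is available (an assumption - the same credentials that let AD Explorer browse the directory usually suffice), a sketch with the third-party ldap3 library can pull the wanted attributes straight to CSV; the server name, base DN, and attribute list below are illustrative:

        import csv
        from ldap3 import ALL, Connection, Server

        # placeholders - an ordinary read-only domain account is usually enough
        conn = Connection(Server("dc01.example.com", get_info=ALL),
                          user="EXAMPLE\\reader", password="secret", auto_bind=True)

        attrs = ["sAMAccountName", "displayName", "whenCreated", "company"]
        conn.search("dc=example,dc=com", "(objectClass=user)", attributes=attrs)

        with open("users.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(attrs)
            for entry in conn.entries:
                writer.writerow([entry[a].value for a in attrs])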

  • Why is only one Excel spreadsheet crippled, but others are fine?

    - by Dallas
    I have an inherited spreadsheet that I really don't want to rebuild at the moment. It's a small workbook (< 200 rows that don't even reach column AA) that does nothing more than calculate some totals within the same worksheets. No macros, no external data sources, nothing beyond basic formatting of dates, numbers and strings. I see importing data from CSV/text has created many, many workbook connections over time, but even if I delete them all (there were hundreds) it makes no difference in performance. Even clicking to simply change focus from cell to cell takes 10+ seconds, adorned by the spinning cursor, (Not Responding) appended to the title bar, and the application locking up. The program seems to "recover" every time, but the efficiency of editing this file is obviously seriously handicapped. All other files seem fine in Excel, and other programs have no apparent performance issues. I see Excel is chewing up CPU, but I'm not sure how to narrow down what process or service is "clashing" with Excel. I tried the same file on other computers and performance is fine. If I turn off all start-up services and run only Excel, performance is restored... until I start using other programs and then it bogs down again. At this point, I would entertain almost any idea, theory or suggestion that helps pinpoint, solve or work around the issue.

  • Server 2008 R2 PowerShell script runs manually, but not as a scheduled task

    - by Aeisor
    I have a PowerShell script that runs fine manually in the PowerShell ISE; however, when run as a scheduled task using an administrator's credentials the task does not run with the expected results. The script:

        $request = new-object System.Net.WebClient
        $request.DownloadFile("...url...", "C:\path\to\file.csv")

    The administrator user has Full Control of both the script and the folder it is writing to. The URL exists and responds in a reasonable time (< 1s). If I run the task manually the status is 0x41301 ("Currently Running") until I eventually end it. I have set the task up using both of these methods: 1. Start a Program: C:\path\to\PS.PS1; 2. Start a Program: C:\windows\system32\WindowsPowerShell\v1.0\powershell.exe with additional options -noninteractive -command "C:\path\to\PS.PS1". Using option 1, the task history shows it has opened an instance of notepad.exe but never terminates it. Using option 2 it completes the task but doesn't download / create the file. I have used Set-ExecutionPolicy Unrestricted as this is not a signed script. Any ideas?
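
    As an aside: if the end goal is only a scheduled CSV download, one hedged alternative is to sidestep PowerShell hosting and execution-policy questions with a small script that Task Scheduler runs directly; the URL and paths below are placeholders mirroring the question:

        # download_rates.py - schedule as: python C:\path\to\download_rates.py
        import urllib.request

        urllib.request.urlretrieve("https://example.com/rates.csv", r"C:\path\to\file.csv")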

  • Permissions Issue with Files Generated by PerfMon

    - by SvrGuy
    We are trying to implement some data logging to CSV files using a Data Collector Set in PerfMon (on a Windows Server 2008 R2 system). The issue we are running into is that we (seemingly) can't control the permissions being set on the log files created by PerfMon. What we want is for the log files created by PerfMon to have Everyone:F permissions (Full Control for Everyone). So, we have a directory structure set up where all logs go into a folder: c:\vms\PerfMonLogs\%MACHINENAME% (e.g. c:\vms\PerfMonLogs\EvaluationG2). In the above example, c:\vms\PerfMonLogs\EvaluationG2 has permissions Everyone:F (below is the icacls output for this directory):

        EVALUATIONG2/ Everyone:(OI)(CI)(F)
                      NT AUTHORITY\SYSTEM:(OI)(CI)(F)
                      BUILTIN\Administrators:(OI)(CI)(F)
                      BUILTIN\Performance Log Users:(OI)(R)

    When the data collector set runs, it creates new subfolders and files within c:\vms\PerfMonLogs\EvaluationG2 (e.g. C:\vms\PerfMonLogs\EVALUATIONG2\M11d26y2012N3). Each of these directories and files has the following permissions:

        M11d26y2012N3 NT AUTHORITY\SYSTEM:(OI)(CI)(F)
                      BUILTIN\Administrators:(OI)(CI)(F)
                      BUILTIN\Performance Log Users:(OI)(R)

    So these new folders are not simply inheriting permissions from the parent folder (don't know why). Now, we tried adding Everyone:F using the security tab on the collector set (no dice). Any ideas? How do we control the permissions on the log files generated by the PerfMon data collector set?
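
    One hedged workaround, if the collector set itself can't be persuaded: schedule a small script after each collection run (or periodically) that re-applies the wanted ACE with icacls. The path comes from the question; the /grant syntax and the (OI)(CI) inheritance flags are standard icacls usage:

        import subprocess

        LOG_ROOT = r"c:\vms\PerfMonLogs\EvaluationG2"

        # grant Everyone full control on the tree; /T recurses into the
        # generated subfolders, (OI)(CI) makes the ACE inherit to new items
        subprocess.run(["icacls", LOG_ROOT, "/grant", "Everyone:(OI)(CI)F", "/T"],
                       check=True)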

  • SQL SERVER – How to Roll Back SQL Server Database Changes

    - by Pinal Dave
    In a perfect scenario, no unexpected and unplanned changes occur. There are no unpleasant surprises, no inadvertent changes. However, even with all precautions and testing, there is sometimes a need to revert a structure or data change. One of the methods that can be used in this situation is to use an older database backup that has the records or database object structure you want to revert to. For this method, you have to have an adequate full database backup, and a tool that will help you with comparison and synchronization is preferred. In this article, we will focus on another method: rolling back the changes. This can be done by using: an option in SQL Server Management Studio, T-SQL, or ApexSQL Log. The first two solutions have been described in this article. The disadvantages of these methods are that you have to know when exactly the change you want to revert happened, and that all transactions on the database executed in a specific time range are rolled back – the ones you want to undo and the ones you don't.

    How to easily roll back SQL Server database changes using ApexSQL Log? The biggest challenge is to roll back just specific changes, not all changes that happened in a specific time range. While the SQL Server Management Studio option and T-SQL read and roll forward all transactions in the transaction log files, I will show you a solution that finds and scripts only the specific changes that match your criteria. Therefore, you don't need to worry about all other database changes that you don't want to roll back. ApexSQL Log is a SQL Server disaster recovery tool that reads transaction logs and provides a wide range of filters that enable you to easily roll back only specific data changes.

    First, connect to the online database where you want to roll back the changes. Once you select the database, ApexSQL Log will show its recovery model. Note that changes can be rolled back even for a database in the Simple recovery model, when no database and transaction log backups are available. However, ApexSQL Log achieves best results when the database is in the Full recovery model and you have a chain of subsequent transaction log backups, back to the moment when the change occurred. In this example, we will use only the online transaction log.

    In the next step, use filters to read only the transactions that happened in a specific time range. To remove noise, it's recommended to use as many filters as possible. Besides filtering by the time of the transaction, ApexSQL Log can filter by the operation type and table name, as well as by transaction state (committed, aborted, running, and unknown), the name of the user who committed the change, specific field values, server process IDs, and transaction description. You can select only the tables affected by the changes you want to roll back. However, if you're not certain which tables were affected, you can leave them all selected and, once the results are shown in the main grid, analyze them to find the ones you want to roll back.

    When you set the filters, you can select how to present the results. ApexSQL Log can automatically create undo or redo scripts, export the transactions into an XML, HTML, CSV, SQL, or SQL Bulk file, and create a batch file that you can use for unattended transaction log reading. In this example, I will open the results in the grid, as I want to analyze them before rolling back the transactions. The results contain information about the transaction, as well as who made it and when.

    For UPDATEs, ApexSQL Log shows both old and new values, so you can easily see what has happened. To create an UNDO script that rolls back the changes, select the transactions you want to roll back and click Create undo script in the menu. For the DELETE statement selected in the screenshot above, the undo script is:

        INSERT INTO [Sales].[PersonCreditCard] ([BusinessEntityID], [CreditCardID], [ModifiedDate])
        VALUES (297, 8010, '20050901 00:00:00.000')

    When it comes to rolling back database changes, ApexSQL Log has a big advantage, as it rolls back only specific transactions, while leaving all other transactions that occurred in the same time range intact. That makes ApexSQL Log a good solution for rolling back inadvertent data and schema changes on your SQL Server databases.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL. Tagged: ApexSQL

  • Thoughts on ODTUG Kscope12

    - by thatjeffsmith
    The rodeo rocked! This wasn't my first rodeo, and it wasn't my first Kscope, but it was probably my favorite one. What is Kscope? It's the annual conference for the Oracle Development Tools User Group: 1,000+ attendees from 20+ countries with an average Jeff Klout score of 65. I just made that metric and score up, but this conference attracts the best and brightest in the Oracle database space. I'm not just talking about the speakers either. The attendees are all top notch. They actively participate in sessions, make an effort to get to know their fellow conference mates, and often turn into volunteers and speakers. Developers that enjoy unit testing, understand the importance of modeling your data, and are eager to understand the Oracle CBO – these are traits that describe the 'average' ODTUG developer. 2012's event was held in San Antonio. Yes, it was very hot. But this might have been the nicest Marriott property I've ever visited, and I've stayed at some nice ones in Hawaii and St. Thomas. They had free WiFi everywhere – the rooms, the Conference Center, the lobby, bars, everywhere. And it worked. The after-hours events were very fun. I embarrassed myself several times, but that's OK. The rodeo was an awesome event and the Thirsty Games experience was something I hope does not make it onto YouTube or Facebook – talking to you, Chet Justice. I finally got to meet and spend some time with some folks I've always wanted to get to know better, @timothyjgorman, @alexgorbachev, @lj_dobson, @dschleis, @kentGraziano, @chriscmuir, @GaloBalda, @patch72, and many, many more! I even made some new friends thanks to the Mentor program and @carol_finn. 2013's event will be in New Orleans. If you haven't joined ODTUG or haven't made it to Kscope, go ahead and mark your calendars. I had 3 presentations this year. Sunday's was not a good performance, and I want to apologize to anyone who was there and was hoping for more. My Tips and Debugging sessions on Monday and Tuesday were more to my liking, and I enjoyed them as a presenter. I hope you enjoyed them as an attendee. I understand that my slide decks were corrupted on the ODTUG site, and I'm working with the coordinator now to get those fixed ASAP. Apparently the 2 most well-received tips were the /*CSV*/ formatting hint and recalling your previous SQL history via the keyboard. I'll be doing a follow-up webinar with ODTUG in a few weeks for those members that weren't able to see my Tips and Debugger sessions in San Antonio. I'll be sure to post details on that here when I have them. My next scheduled conference is Oracle Open World, and I may have a couple of shows after that to round out 2012.

  • SSIS Technique to Remove/Skip Trailer and/or Bad Data Row in a Flat File

    - by Compudicted
    I noticed that the question of how to skip or bypass a trailer record or a badly formatted/empty row in an SSIS package keeps coming back on the MSDN SSIS Forum. I tried to figure out why, and after an extensive search inside the forum and outside it on the entire Web (using several search engines) I found that even though there are a number of posts and articles on the topic, none of them employ the simplest and most efficient technique. When I say efficient, I mean the shortest time to solution for the fellow developers. OK, enough talk. Let's face the problem: typically a flat file (e.g. a comma-delimited/CSV file) needs to be processed (loaded into a database in most cases, really). Oftentimes, such an input file is produced by some sort of an out-of-control, third-party solution and comes in with some garbage characters and/or even malformed/mis-formatted rows. One such example could be this imaginary file: as you can see, several rows have no data and there is an occasional garbage character (1, in this example, on row #7). Our task is to produce a clean file that will only capture the meaningful data rows. As an aside, our output/target may be a database table, but for the purpose of this exercise we will simply re-format the source. Let's outline our course of action to start off: we will use SSIS 2005 to create a DFT; the DFT will use a Flat File Source pointed at our input [bad] flat file; we will use a Conditional Split to process the bad input file; and finally, dump the resulting data to a new [clean] file. Well, only four steps; let's see if it is too much work. 1: Start BIDS and add a DFT to the Control Flow designer (I named it Process Dirty File DFT). 2 and 3: I had added the data viewer just to see what I am getting; alas, surprisingly, the data issues were not seen in it. The real key to the approach is to properly set up the Conditional Split Transformation, and specifically its SSIS Expression: LEN([After CS Column 0]) > 1. The point is to employ the right Boolean expression (yes, the Conditional Split accepts only Boolean conditions). For the sake of this post I re-named the Output Name "No Empty Rows", but by default it will be named Case 1 (remember to drag your first column into the expression area)! You can close your Conditional Split now. The next part will be crucial - consuming the output of our Conditional Split. Last step - #4: add a Flat File Destination or any other one you need. Click on the Conditional Split and choose the green arrow to drop onto the target. When you do so, make sure you choose the No Empty Rows output and NOT the Conditional Split Default Output. Make the necessary mappings. At this point your package must look like the above. As the last step we will run our package to examine the produced output file. F5: and... it looks great!
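
    For comparison outside SSIS, the same "keep only meaningful rows" filter is a couple of lines of plain Python (a sketch over hypothetical file names; the length test mirrors the Conditional Split expression above):

        # mirror of LEN([After CS Column 0]) > 1 applied to a raw text file
        with open("dirty.txt") as src, open("clean.txt", "w") as dst:
            for line in src:
                if len(line.strip()) > 1:  # drops empty rows and lone garbage characters
                    dst.write(line)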

  • New Analytic settings for the new code

    - by Steve Tunstall
    If you have upgraded to the new 2011.1.3.0 code, you may find some very useful settings for Analytics. If you didn't already know, the analytic datasets have the potential to fill up your OS hard drives. The more datasets you use and create, the faster this can happen. Since they take a measurement every second, forever, some of these metrics can reach multiple GB in size in a matter of weeks. The traditional 'fix' was that you had to go into Analytics -> Datasets about once a month and clean up the largest datasets. You did this by deleting them. Ouch. Now you lost all of that historical data that you might have wanted to check out many months from now. Or, you had to export each metric individually to a CSV file first. Not very easy or fun. You could also suspend a dataset, and have it not collect data at all. Well, that fixed the problem, didn't it? Of course, you now had no data to go look at. Hmmmm.... All of this is no longer a concern. Check out the new Settings tab under Analytics... Now, I can tell the ZFSSA to keep every second of data for, say, 2 weeks, and then average those 60 seconds of each minute into a single 'minute' value. I can go even further and ask it to average those 60 minutes of data into a single 'hour' value. This allows me to effectively shrink my older datasets by a factor of 1/3600!!! Very cool. I can now allow my datasets to go on forever, and really never have to worry about them filling up my OS drives. That's great going forward, but what about those huge datasets you already have? No problem. Another new feature in 2011.1.3.0 is the ability to shrink the older datasets in the same way. Check this out. I have here a dataset called "Disk: I/O opps per second" that is about 6.32M on disk (you need not worry so much about the "In Core" value, as that is in RAM, and it fluctuates all the time; once you stop viewing a particular metric, you will see that shrink over time, just relax). When one clicks on the trash can icon to the right of the dataset, it used to delete the whole thing, and you would have to re-create it from scratch to get the data collecting again. Now, however, it gives you this prompt: as you can see, this allows you to once again shrink the dataset by averaging the second data into minutes or hours. Here is my new dataset size after I do this: it shrank from 6.32 MB down to 2.87 MB, but I can still see my metrics going back to the time I began the dataset. Now, you do understand that once you do this, as you look back in time at the minute or hour data metrics, you are going to see much larger time values, right? You will need to decide what granularity you can live with, and for how long. Check this out. Here is my Disk: Percent utilized from 5-21-2012 2:42 pm to 4:22 pm. After I went through the delete process to change everything older than 1 week to "Minutes", the same date and time looks like this. Just understand what this will do and how you want to use it. Right now, I'm thinking of keeping the last 6 weeks of data as "Seconds", then the last 3 months as "Minutes", and then "Hours" forever after that. I'll check back in six months and see how the sizes look. Steve

  • Profiling Silverlight Applications after installing Visual Studio 2010 Service Pack 1

    - by mbcrump
    Introduction: Now that the dust has settled and everyone has downloaded and installed Visual Studio 2010 Service Pack 1, it's time to talk about a new feature included that will help Silverlight developers profile their applications. Let's take a look at what the official documentation says about it: "Performance Wizard for Silverlight" - taken from the VS2010 SP1 KB: Visual Studio 2010 SP1 enables you to tune Silverlight application performance by profiling the code. A traditional code profiler cannot tune the rendering performance for Silverlight applications. Many higher-level profilers are added to Visual Studio 2010 SP1 so that you can better determine which parts of the application consume time. So, how do you do it? After you finish installing VS2010 SP1, make sure it took by going to Help -> About. You should see SP1Rel under Visual Studio 2010 as shown below. Now that we have verified you are on the most current release, let's load up a Silverlight application. I'm going to take my hobby Silverlight project that I created a month or so ago. The reason I'm picking this project is that I didn't focus much on performance, as it was just built for fun and to see what I could do with Silverlight. I believe this makes it the perfect application to profile. After the project is loaded, click on Analyze, then Launch Performance Wizard. Go ahead and click on CPU Sampling (recommended). You will notice that it asks which application to target. By default, it will select the .Web project in a Silverlight application. Go ahead and leave the default Web project checked. We are going to leave the client as Internet Explorer. Now go ahead and click Finish. Now your Silverlight application will launch. While your application is running, you will see the following inside of Visual Studio 2010. Here is where you will need to attach your Silverlight application to the web application that is currently being profiled. Simply click on the Attach/Detach button below and find your application to attach to the profiler. In my case, I am using IE8 and could find it by the title. After you close your browser, you will notice it generated a report. These files will end with .VSP. If you click on the .VSP you will see it generated the following report. We could turn off "Just My Code", but it may pick up things that we didn't want to profile, as shown below. One other feature to note is that you may want to export the data to CSV or XML. You can do that by looking at the toolbar and clicking the button highlighted below. Conclusion: The profiler for Silverlight is a great addition to an already great product. So before you ship a Silverlight application, run it through the profiler and see what comes up. Since it's included and free, I can't see a reason not to do this. Thanks again for reading and I hope you subscribe to my blog or follow me on Twitter for more Silverlight/WP7 fun. Subscribe to my feed

  • Google Analytics recording event based on <a> title attribute

    - by rlsaj
    I am declaring:

        var title = (typeof(el.attr('title')) != 'undefined') ? el.attr('title') : "";

    and then have the following:

        else if (title.match(/^"Matching Content"\:/i)) {
            elEv.category = "Matching Content Click";
            elEv.action = "click-Matching-Content";
            elEv.label = href.replace(/^https?\:\/\//i, '');
            elEv.non_i = true;
            elEv.loc = href;
        }

    However, using the Google Analytics debugger, this is not being recorded. Any suggestions? The complete function is:

        if (typeof jQuery != 'undefined') {
            jQuery(document).ready(function gLinkTracking($) {
                var filetypes = /\.(avi|csv|dat|dmg|doc.*|exe|flv|gif|jpg|mov|mp3|mp4|msi|pdf|png|ppt.*|rar|swf|txt|wav|wma|wmv|xls.*|zip)$/i;
                var baseHref = '';
                if (jQuery('base').attr('href') != undefined)
                    baseHref = jQuery('base').attr('href');
                jQuery('a').on('click', function (event) {
                    var el = jQuery(this);
                    var track = true;
                    var href = (typeof(el.attr('href')) != 'undefined') ? el.attr('href') : "";
                    var title = (typeof(el.attr('title')) != 'undefined') ? el.attr('title') : "";
                    var isThisDomain = href.match(document.domain.split('.').reverse()[1] + '.'
                                                  + document.domain.split('.').reverse()[0]);
                    if (!href.match(/^javascript:/i)) {
                        var elEv = [];
                        elEv.value = 0, elEv.non_i = false;
                        if (href.match(/^mailto\:/i)) {
                            elEv.category = "Email link";
                            elEv.action = "click-email";
                            elEv.label = href.replace(/^mailto\:/i, '');
                            elEv.loc = href;
                        } else if (title.match(/^"Matching Content"\:/i)) {
                            elEv.category = "Matching Content Click";
                            elEv.action = "click-Matching-Content";
                            elEv.label = href.replace(/^https?\:\/\//i, '');
                            elEv.non_i = true;
                            elEv.loc = href;
                        } else if (href.match(filetypes)) {
                            var extension = (/[.]/.exec(href)) ? /[^.]+$/.exec(href) : undefined;
                            elEv.category = "File Downloaded";
                            elEv.action = "click-" + extension[0];
                            elEv.label = href.replace(/ /g, "-");
                            elEv.loc = baseHref + href;
                        } else if (href.match(/^https?\:/i) && !isThisDomain) {
                            elEv.category = "External link";
                            elEv.action = "click-external";
                            elEv.label = href.replace(/^https?\:\/\//i, '');
                            elEv.non_i = true;
                            elEv.loc = href;
                        } else if (href.match(/^tel\:/i)) {
                            elEv.category = "Telephone link";
                            elEv.action = "click-telephone";
                            elEv.label = href.replace(/^tel\:/i, '');
                            elEv.loc = href;
                        } else {
                            track = false;
                        }
                        if (track) {
                            _gaq.push(['_trackEvent', elEv.category.toLowerCase(),
                                       elEv.action.toLowerCase(), elEv.label.toLowerCase(),
                                       elEv.value, elEv.non_i]);
                            if (el.attr('target') == undefined
                                    || el.attr('target').toLowerCase() != '_blank') {
                                setTimeout(function() { location.href = elEv.loc; }, 400);
                                return false;
                            }
                        }
                    }
                });
            });
        }

  • Algorithm design: "randomising" a timetable schedule in Python, although open to other languages

    - by S1syphus
    Before I start I should add that I am a musician and not a native programmer; this was undertaken to make my life easier. Here is the situation: at work I'm given a new CSV file, which contains a list of sound files, their lengths, and the minimum total amount of time each must be played. From this file I create a playlist of exactly 60 minutes. Each sample is played at least its minimum number of instances, but spread out from the others, so there is never a period where one sound is played twice in a row or in close proximity to itself. Secondly, if the minimum instances of each sound have been used and there is still time within the 60 minutes, it needs to fill the remaining time with more plays, while adhering to the above. The smallest duration possible is 15 seconds, and then multiples of 15 seconds. Here is what I came up with in Python and the problems I'm having with it; as one user said, it's buggy due to the random library used in it. So I'm guessing a total rethink is on the table, and here is where I need your help. What is the best way to solve the issue? I have had a brief look at things like knapsack and bin-packing algorithms; while both are relevant, neither is quite appropriate and maybe a bit beyond me.
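
    One possible shape for a rethink - a greedy sketch that honours the minimum counts first, pads to 60 minutes, then orders plays so the least-recently played sound goes next. The CSV column names are assumptions, and ties are broken arbitrarily rather than optimally:

        import csv
        import random

        TOTAL = 60 * 60  # playlist length in seconds

        def load(path):
            # assumed columns: name, length_seconds, min_plays
            with open(path, newline="") as f:
                return [(r["name"], int(r["length_seconds"]), int(r["min_plays"]))
                        for r in csv.DictReader(f)]

        def build_playlist(sounds):
            # 1. honour the minimum instance counts
            pool = [(n, l) for n, l, m in sounds for _ in range(m)]
            used = sum(l for _, l in pool)
            # 2. pad with extra plays until exactly 60 minutes (if lengths allow)
            while used < TOTAL:
                fits = [(n, l) for n, l, _ in sounds if used + l <= TOTAL]
                if not fits:
                    break  # caller should verify used == TOTAL
                n, l = random.choice(fits)
                pool.append((n, l))
                used += l
            # 3. order the pool: always pick the eligible sound played longest ago
            playlist, last, t = [], {}, 0
            while pool:
                pool.sort(key=lambda s: last.get(s[0], -TOTAL))
                prev = playlist[-1][0] if playlist else None
                pick = next((s for s in pool if s[0] != prev), pool[0])
                pool.remove(pick)
                playlist.append(pick)
                last[pick[0]] = t
                t += pick[1]
            return playlist

        for name, length in build_playlist(load("sounds.csv")):
            print(name, length)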

  • MKMapView Zoom Level

    - by meetS
    I'm using MKMapView with Google Maps. I have succeeded in showing the map view and the address with an annotation pin, but I want an increased zoom level. How can I set it programmatically?

        - (void)showMyAddress
        {
            // Hide the keypad
            MKCoordinateRegion region;
            MKCoordinateSpan span;
            span.latitudeDelta = 0.2;
            span.longitudeDelta = 0.2;
            CLLocationCoordinate2D location = [self addressLocation];
            region.span = span;
            region.center = location;
            if (addAnnotation != nil) {
                [mapView removeAnnotation:addAnnotation];
                [addAnnotation release];
                addAnnotation = nil;
            }
            addAnnotation = [[AddAnnotation alloc] initWithCoordinate:location];
            [addAnnotation setMTitle:@"abc"];
            [addAnnotation setMSubTitle:@"def."];
            [mapView addAnnotation:addAnnotation];
            [mapView setRegion:region animated:TRUE];
            [mapView regionThatFits:region];
        }

        - (CLLocationCoordinate2D)addressLocation
        {
            NSString *urlString = [NSString stringWithFormat:@"http://maps.google.com/maps/geo?q=%@&output=csv",
                                   [@"abc" stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];
            NSString *locationString = [NSString stringWithContentsOfURL:[NSURL URLWithString:urlString]
                                                                 encoding:NSStringEncodingConversionAllowLossy
                                                                    error:nil];
            NSArray *listItems = [locationString componentsSeparatedByString:@","];
            double latitude = 0.0;
            double longitude = 0.0;
            if ([listItems count] >= 4 && [[listItems objectAtIndex:0] isEqualToString:@"200"]) {
                latitude = [[listItems objectAtIndex:2] doubleValue];
                longitude = [[listItems objectAtIndex:3] doubleValue];
            } else {
                // geocoding failed; falls through with 0,0
            }
            CLLocationCoordinate2D location;
            location.latitude = latitude;
            location.longitude = longitude;
            return location;
        }

  • How to populate Java (web) application with initial data using Spring/JPA/Hibernate

    - by Tuukka Mustonen
    I want to set up my database with initial data programmatically. I want to populate my database for development runs, not for testing runs (that's easy). The product is built on top of Spring and JPA/Hibernate.

    1. Developer checks out the project
    2. Developer runs a command/script to set up the database with initial data
    3. Developer starts the application (server) and begins developing/testing

    then:

    4. Developer runs a command/script to flush the database and set it up with new initial data, because the database structures or the initial data bundle have changed

    What I want is to set up my environment by the required parts in order to call my DAOs and insert new objects into the database. I do not want to create initial data sets in raw SQL, XML, take dumps of the database or whatever. I want to programmatically create objects and persist them in the database as I would in normal application logic. One way to accomplish this would be to start up my application normally and run a special servlet that does the initialization. But is that really the way to go? I would love to execute the initial data setup as a Maven task, and I don't know how to do that if I take the servlet approach. There is a somewhat similar question. I took a quick glance at the suggested DBUnit and Unitils, but they seem to be heavily focused on setting up testing environments, which is not what I want here. DBUnit does initial data population, but only using XML/CSV fixtures, which is not what I'm after. Then, Maven has an SQL plugin, but I don't want to handle raw SQL. Maven also has a Hibernate plugin, but it seems to help only with Hibernate configuration and table schema creation (not with populating the DB with data). How can I do this?

  • Empty responseText from XMLHttpRequest

    - by PurplePilot
    I have written an XMLHttpRequest which runs fine but returns an empty responseText. The JavaScript is as follows:

        var anUrl = "http://api.xxx.com/rates/csv/rates.txt";
        var myRequest = new XMLHttpRequest();
        callAjax(anUrl);

        function callAjax(url) {
            myRequest.open("GET", url, true);
            myRequest.onreadystatechange = responseAjax;
            myRequest.setRequestHeader("Cache-Control", "no-cache");
            myRequest.send(null);
        }

        function responseAjax() {
            if (myRequest.readyState == 4) {
                if (myRequest.status == 200) {
                    result = myRequest.responseText;
                    alert(result);
                    alert("we made it");
                } else {
                    alert("An error has occurred: " + myRequest.statusText);
                }
            }
        }

    The code runs fine. I can walk through it and I get readyState == 4 and status == 200, but the responseText is always blank. I am getting a log error (in the Safari debugger) of "Error dispatching: getProperties", which I cannot seem to find any reference to. I have run the code in Safari and Firefox, both locally and on a remote server. The URL, when put into a browser, will return the string and give a status code of 200. I wrote similar code against the same URL in a Mac widget, which runs fine, but the same code in a browser never returns a result.

  • Impersonation in ASP.NET MVC

    - by eibrahim
    I have an action that needs to read a file from a secure location, so I have to use impersonation to read the file. This code WORKS:

        [AcceptVerbs(HttpVerbs.Get)]
        public ActionResult DirectDownload(Guid id)
        {
            if (Impersonator.ImpersonateValidUser())
            {
                try
                {
                    var path = "path to file";
                    if (!System.IO.File.Exists(path))
                    {
                        return View("filenotfound");
                    }
                    var bytes = System.IO.File.ReadAllBytes(path);
                    return File(bytes, "application/octet-stream", "FileName");
                }
                catch (Exception e)
                {
                    Log.Exception(e);
                }
                finally
                {
                    Impersonator.UndoImpersonation();
                }
            }
            return View("filenotfound");
        }

    The only problem with the above code is that I have to read the entire file into memory, and I am going to be dealing with VERY large files, so this is not a good solution. But if I replace these 2 lines:

        var bytes = System.IO.File.ReadAllBytes(path);
        return File(bytes, "application/octet-stream", "FileName");

    with this:

        return File(path, "application/octet-stream", "FileName");

    it does NOT work, and I get the error message: Access to the path 'c:\projects\uploads\1\aa2bcbe7-ea99-499d-add8-c1fdac561b0e\Untitled 2.csv' is denied. I guess the File result with a path tries to open the file at a later time in the request pipeline, when I have already "undone" the impersonation. Remember, the impersonation code works, because I can read the file into the bytes array. What I want to do, though, is stream the file to the client. Any idea how I can work around this? Thanks in advance.

  • Problem with Informix JDBC, MONEY and decimal separator in string literals

    - by Michal Niklas
    I have a problem with a JDBC application that uses the MONEY data type. When I insert into a MONEY column:

        insert into _money_test (amt) values ('123.45')

    I get the exception: Character to numeric conversion error. The same SQL works from a native Windows application using the ODBC driver. I live in Poland and have the Polish locale, and in my country a comma separates the decimal part of a number, so I tried:

        insert into _money_test (amt) values ('123,45')

    And it worked. I checked that in PreparedStatement I must use the dot separator: 123.45. And of course I can use:

        insert into _money_test (amt) values (123.45)

    But some code is "general": it imports data from a CSV file, and it was safe to put the number into a string literal. How do I force JDBC to use DBMONEY (or simply a dot) in literals? My workstation is WinXP. I have the ODBC and JDBC Informix clients in version 3.50 TC5/JC5. I have set DBMONEY to just a dot: DBMONEY=.

    EDIT: Test code in Jython:

        import sys
        import traceback
        from java.sql import DriverManager
        from java.lang import Class

        Class.forName("com.informix.jdbc.IfxDriver")

        QUERY = "insert into _money_test (amt) values ('123.45')"

        def test_money(driver, db_url, usr, passwd):
            try:
                print("\n\n%s\n--------------" % (driver))
                db = DriverManager.getConnection(db_url, usr, passwd)
                c = db.createStatement()
                c.execute("delete from _money_test")
                c.execute(QUERY)
                rs = c.executeQuery("select amt from _money_test")
                while (rs.next()):
                    print('[%s]' % (rs.getString(1)))
                rs.close()
                c.close()
                db.close()
            except:
                print("there were errors!")
                s = traceback.format_exc()
                sys.stderr.write("%s\n" % (s))

        print(QUERY)
        test_money("com.informix.jdbc.IfxDriver",
                   'jdbc:informix-sqli://169.0.1.225:9088/test:informixserver=ol_225;DB_LOCALE=pl_PL.CP1250;CLIENT_LOCALE=pl_PL.CP1250;charSet=CP1250',
                   'informix', 'passwd')
        test_money("sun.jdbc.odbc.JdbcOdbcDriver", 'jdbc:odbc:test', 'informix', 'passwd')

    Results when I run the money literal with a comma and with a dot:

        C:\db_examples>jython ifx_jdbc_money.py
        insert into _money_test (amt) values ('123,45')

        com.informix.jdbc.IfxDriver
        --------------
        [123.45]

        sun.jdbc.odbc.JdbcOdbcDriver
        --------------
        there were errors!
        Traceback (most recent call last):
          File "ifx_jdbc_money.py", line 16, in test_money
        SQLException: java.sql.SQLException: [Informix][Informix ODBC Driver][Informix]Character to numeric conversion error

        C:\db_examples>jython ifx_jdbc_money.py
        insert into _money_test (amt) values ('123.45')

        com.informix.jdbc.IfxDriver
        --------------
        there were errors!
        Traceback (most recent call last):
          File "ifx_jdbc_money.py", line 16, in test_money
        SQLException: java.sql.SQLException: Character to numeric conversion error

        sun.jdbc.odbc.JdbcOdbcDriver
        --------------
        [123.45]

  • Exporting from SSRS 2008 ReportViewer to Excel Causes Duplicate Columns

    - by Daniel Coffman
    I have a report that groups months by quarters, so each quarter has three months, and the display of the months under the quarter is toggled by the quarter header. It looks just fine in the ReportViewer, but when exporting to Excel, the first month in the quarter with data is duplicated and appended to the end of the quarter group. Here is what it looks like in the ReportViewer (with Quarters 2 and 4 expanded; note May and June do not have any data and show blank columns by design): http://i.imgur.com/MykZE.png This is how it looks when exported to Excel: http://i.imgur.com/zfLuk.png The collapsed quarter should only show the LAST month in the quarter. You can see that in the Excel export July is inserted in Q1 even though it should be hidden entirely since that quarter is collapsed, December is appended to Q2, January is inserted into Q3, and April is duplicated and appended to Q4. Exporting to any format OTHER than Excel works correctly and does not insert these columns. A similar bug for rows was filed and marked as "by design": http://connect.microsoft.com/SQLServer/feedback/details/508823/reporting-services-2008-group-by-export-to-excel-duplicate-rows-csv-ok-pdf-ok How do I stop the export-to-Excel feature from inserting these duplicate columns?

  • Seems doctrine listener is not fired

    - by Roel Veldhuizen
    Got a service which should be executed the moment an object is persisted. Though I think the code looks like it should work, it doesn't. I configured the service with the following yml:

        services:
            bla_orm.listener:
                class: Bla\OrmBundle\EventListener\UserManager
                arguments: [@security.encoder_factory]
                tags:
                    - { name: doctrine.event_listener, event: prePersist }

    The class:

        namespace Bla\OrmBundle\EventListener;

        use Doctrine\ORM\Event\LifecycleEventArgs;
        use Bla\OrmBundle\Entity\User;

        class UserManager
        {
            protected $encoderFactory;

            public function __construct(\Symfony\Component\Security\Core\Encoder\EncoderFactoryInterface $encoderFactory)
            {
                $this->encoderFactory = $encoderFactory;
            }

            public function prePersist(LifecycleEventArgs $args)
            {
                $entity = $args->getEntity();
                if ($entity instanceof User) {
                    $encoder = $this->encoderFactory->getEncoder($entity);
                    $entity->setSalt(rand(10000, 99999));
                    $password = $encoder->encodePassword($entity->getPassword(), $entity->getSalt());
                    $entity->setPassword($password);
                }
            }
        }

    Symfony version: Symfony version 2.3.3 - app/dev/debug

    Output of container:debug ([container] Public services; columns are Service Id, Scope, Class Name): annotation_reader container Doctrine\Common\Annotations\FileCacheReader assetic.asset_manager container Assetic\Factory\LazyAssetManager assetic.controller prototype Symfony\Bundle\AsseticBundle\Controller\AsseticController assetic.filter.cssrewrite container Assetic\Filter\CssRewriteFilter assetic.filter_manager container Symfony\Bundle\AsseticBundle\FilterManager assetic.request_listener container Symfony\Bundle\AsseticBundle\EventListener\RequestListener cache_clearer container Symfony\Component\HttpKernel\CacheClearer\ChainCacheClearer cache_warmer container Symfony\Component\HttpKernel\CacheWarmer\CacheWarmerAggregate data_collector.request container Symfony\Component\HttpKernel\DataCollector\RequestDataCollector data_collector.router container Symfony\Bundle\FrameworkBundle\DataCollector\RouterDataCollector database_connection n/a alias for doctrine.dbal.default_connection debug.controller_resolver container Symfony\Component\HttpKernel\Controller\TraceableControllerResolver debug.deprecation_logger_listener container Symfony\Component\HttpKernel\EventListener\ErrorsLoggerListener debug.emergency_logger_listener container Symfony\Component\HttpKernel\EventListener\ErrorsLoggerListener debug.event_dispatcher container Symfony\Component\HttpKernel\Debug\TraceableEventDispatcher debug.stopwatch container Symfony\Component\Stopwatch\Stopwatch debug.templating.engine.php container Symfony\Bundle\FrameworkBundle\Templating\TimedPhpEngine debug.templating.engine.twig n/a alias for templating doctrine container Doctrine\Bundle\DoctrineBundle\Registry doctrine.dbal.connection_factory container Doctrine\Bundle\DoctrineBundle\ConnectionFactory doctrine.dbal.default_connection container stdClass doctrine.orm.default_entity_manager container Doctrine\ORM\EntityManager doctrine.orm.default_manager_configurator container Doctrine\Bundle\DoctrineBundle\ManagerConfigurator doctrine.orm.entity_manager n/a alias for doctrine.orm.default_entity_manager doctrine.orm.validator.unique container Symfony\Bridge\Doctrine\Validator\Constraints\UniqueEntityValidator doctrine.orm.validator_initializer container Symfony\Bridge\Doctrine\Validator\DoctrineInitializer event_dispatcher container Symfony\Component\EventDispatcher\ContainerAwareEventDispatcher file_locator container Symfony\Component\HttpKernel\Config\FileLocator filesystem container
[... remainder of the service list: the Form component types and extensions, fragment renderers, the HTTP kernel, the locale listener, the Monolog channels, the profiler, the request/response listeners, the router, the security firewall contexts, the Sensio FrameworkExtra listeners and param converters, session storage, Swift Mailer, the templating helpers, the translation loaders and dumpers, Twig, the URI signer, the validator, and the web profiler controllers ...]

The one custom service in the list is the listener in question, and it is registered as expected:

    bla_orm.listener        container    Bla\OrmBundle\EventListener\UserManager

Update: It seems that the listener is not invoked when an updateAction generated by generate:doctrine:crud runs, although it is invoked from another part of the code. Both places use ordinary controllers, and both save their changes with $em->persist($something); $em->flush(); so I would expect the listener to be invoked in both cases.
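For reference, a minimal sketch of what such a listener might look like; the actual UserManager implementation is not shown here, so the method names and the events handled are assumptions. The choice of event matters, and it is the most likely explanation for the behaviour described above: prePersist fires only for new entities, while an update dispatches preUpdate/postUpdate, and then only when flush() computes a non-empty changeset.

    <?php
    // Hypothetical sketch of Bla\OrmBundle\EventListener\UserManager --
    // method names and the events handled are assumptions, not the
    // actual code.
    namespace Bla\OrmBundle\EventListener;

    use Doctrine\ORM\Event\LifecycleEventArgs;
    use Doctrine\ORM\Event\PreUpdateEventArgs;

    class UserManager
    {
        // Fires only when a NEW entity is persisted and flushed. In a CRUD
        // updateAction the entity is already managed, so $em->persist() is
        // a no-op for it and this method is never called.
        public function prePersist(LifecycleEventArgs $args)
        {
            $entity = $args->getEntity();
            // ... react to a newly created entity
        }

        // Fires during flush() for a managed entity, but only when
        // Doctrine's unit of work has computed a non-empty changeset;
        // if no mapped field actually changed, the event is skipped.
        public function preUpdate(PreUpdateEventArgs $args)
        {
            $entity = $args->getEntity();
            // ... react to a modified entity
        }
    }

The wiring below is likewise an assumed example of how such a listener is typically tagged; the output above does not show which Doctrine events bla_orm.listener is registered for.

    # e.g. in the bundle's services.yml -- assumed wiring, not taken from the output above
    services:
        bla_orm.listener:
            class: Bla\OrmBundle\EventListener\UserManager
            tags:
                - { name: doctrine.event_listener, event: prePersist }
                - { name: doctrine.event_listener, event: preUpdate }

If the service is tagged only for prePersist, the generated updateAction will never trigger it: persist() on an already-managed entity does nothing, and the update path only dispatches the *Update events. Adding a tag for preUpdate or postUpdate would cover both code paths.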

    Read the article
