
Topic awaiting preservation: Anyone used IMAP Proxy???? (Page 1 of 1)

 
paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-07-2009 17:08

Hi guys, I was wondering: has anyone used an IMAP proxy like the one at http://www.imapproxy.org/? If so, how do you set it up, and how much of an improvement is it?

Tyberius Prime
Maniac (V) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 01-08-2009 10:21

I don't even see why you'd use something like this? Seriously, logging in on an IMAP server
shouldn't take too long, compared with finding 50,000 mails in your inbox.

(Note: IMAP is one of the worst protocols on the internet. That it works at all is surprising;
that implementations nowadays generally work almost flawlessly is nothing short of a miracle.
I remember the times when there was one IMAP client you could call 'working'.)

paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-08-2009 21:32

Well, it's not actually me - the project we are working on uses Roundcube mail, and it's really slow, so we need to speed it up - my boss wants it to be almost as quick as Outlook. The thing is that Outlook downloads all emails to a local file system. The IMAP proxy is just a temporary fix, and so far we've found it's only available for Linux systems - DAYAM MICROSOFT AGAIN!

The other idea we had was to rewrite the Roundcube classes so they would read the emails from a local database. This might remove the need for connecting to the IMAP server at all, except for downloading new emails. However, it's the part about downloading new emails that has me perplexed. I thought of creating a PHP script that would be called via an ajax call, or run in the background, to check for new messages and download them periodically; however, there is the risk of the connection breaking and losing messages midway. The aim is to get our application to function like POP3 and download all messages from our IMAP server into our database. Reading from the database is no issue at all - the major problem lies in writing to the database, i.e. downloading the emails. How do we get around this? Do I need to start a new topic, or could we carry on with this in this thread?

Tyberius Prime
Maniac (V) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 01-09-2009 09:15

Huh? By checking, for each mail in the inbox, whether it's already in the database; if it isn't, upload it. Start going 'forwards' in time from the date of the newest message you have in the database. No deletion on the IMAP side needed, I believe, and no chance of losing messages.
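
A minimal sketch of that approach using PHP's imap extension (the connection string, credentials, PDO setup and the 'messages' table are all placeholders for illustration, not anything from this thread):

code:
<?php
// Copy only messages newer than the latest one already in the local database.
$pdo  = new PDO('mysql:host=localhost;dbname=webmail', 'user', 'pass');
$last = $pdo->query('SELECT MAX(received_at) FROM messages')->fetchColumn();

$mbox = imap_open('{imap.example.com:993/imap/ssl}INBOX', 'account', 'secret');

// Ask the server only for mail on or after the newest date we already hold.
// IMAP's SINCE has day granularity, so we still dedupe by Message-ID below.
$since = $last ? date('j-M-Y', strtotime($last)) : '1-Jan-1990';
$uids  = imap_search($mbox, 'SINCE "' . $since . '"', SE_UID);

if ($uids) {
    foreach ($uids as $uid) {
        $ov     = imap_fetch_overview($mbox, (string) $uid, FT_UID);
        $exists = $pdo->prepare('SELECT 1 FROM messages WHERE message_id = ?');
        $exists->execute(array($ov[0]->message_id));
        if ($exists->fetchColumn()) {
            continue;   // already copied on an earlier run, nothing to do
        }
        $raw = imap_fetchheader($mbox, $uid, FT_UID) . imap_body($mbox, $uid, FT_UID);
        $ins = $pdo->prepare('INSERT INTO messages (message_id, received_at, raw) VALUES (?, ?, ?)');
        $ins->execute(array($ov[0]->message_id, date('Y-m-d H:i:s', $ov[0]->udate), $raw));
    }
}
imap_close($mbox);

Nothing is deleted on the server, so a run that dies halfway simply re-checks a few Message-IDs on its next pass.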


That IMAP proxy won't cache messages either.
And forget about this ever being as quick as Outlook. IMAP servers just ain't.

paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-10-2009 07:26

Hmm, well the idea is to clean up our IMAP server - the thing is, how do we implement the writing-to-the-database part, i.e. retrieve a message from the inbox and write it to the database? I mean, there's the risk of losing a connection, or timeout problems - let's suppose, and this is a common occurrence, that we end up having hundreds of emails coming in every 5 minutes. Downloading a hundred emails every minute is in itself a tall order, which I don't think would be wise to implement using an ajax call. Any other way to do this? I heard about using a cron job or so - but how would I go about it? We're running on a Windows platform, and whatever process we choose must be configurable in code.

paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-12-2009 14:52

Come on guys, I really need some ideas for this - I'm running outta time, my deadline's in 48 hours

paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-12-2009 16:31

OK I have one idea - just tell me if it's a good idea. I'm thinking of having a simple script run in a hidden iframe. The script would download all the emails from the IMAP server, and upon inserting them into the database it would output some javascript which would refresh the script in the iframe again after X seconds to check for and download newer messages.

The thing is, in the case of a timeout, is there a way to catch it and restart the script from the beginning rather than just dying out...

Come on guys, it's kinda pointless having to discuss a project on your own... Anyone out there... assistance requested

Tyberius Prime
Maniac (V) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 01-12-2009 16:47

I am sorry parity, but you're saddling this horse from the wrong end.
(And there's *no* benefit to using an iframe instead of ajax - but the ajax call can tell
you to refresh the client-side inbox as well.)

All client-side initiated requests are subject to 'timing out'/being cancelled at any given time.
So is anything called by a cron job or the Windows scheduler equivalent.
So is actually anything running on your server - because the server might go down, or the network connection
is interrupted, or whatnot.

You need to be able to handle an interruption during the download of each single e-mail. Once you've done
that, you can choose any of the methods you have proposed to actually run your mail copying (provided
you can ensure that there aren't multiple copies trying to download the same e-mails
at the same time - a central 'connection management' instance comes to mind, one that keeps an
imap-box -> open connection list...).

Basically, for each given mailbox -> imap-box connection you need to do the following:
-find the last e-mail you have successfully copied (or at least, have a token that the next step will accept).
-ask the server for all newer e-mails.
-start with the oldest one in the list:
-write down that you're going to retrieve this e-mail (note 1).
-retrieve the e-mail.
-note that you have retrieved the e-mail and will now put it in the database (note 2).
-remove note 1.
-store the e-mail in the database (it now becomes the 'last successfully retrieved e-mail').
-remove note 2.
-proceed with the next e-mail in the list.
-if the list is empty, refresh the list.

anytime your code starts, it starts checking backwards:
if there's a note 2, we have an e-mail to push into the database.
if there's a note 1, we know what e-mail to retrieve.
otherwise, we work from the bottom of our list.
if our list runs empty, we fetch the new e-mails.


Once you have established such an interruptible workflow
(read up on journaling systems), the system will start
retrieving messages again afterwards, no matter what happens.
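
A rough sketch of that journalled loop (purely illustrative: the sync_journal and messages tables, the stage names and the helper functions are my own assumptions - only the two 'notes' come from the description above):

code:
<?php
// Interruptible mail copy: every network/database step is bracketed by journal entries,
// so a restart can always tell how far the previous run got.

function sync_mailbox(PDO $pdo, $mbox) {
    // Recovery pass: finish whatever an interrupted run left behind.
    $pending = $pdo->query("SELECT uid, stage, raw FROM sync_journal LIMIT 1")->fetch(PDO::FETCH_ASSOC);
    if ($pending) {
        if ($pending['stage'] === 'retrieved') {        // "note 2": mail fetched, not yet stored
            store_message($pdo, $pending['uid'], $pending['raw']);
        } else {                                        // "note 1": we know which mail to retrieve
            fetch_one($pdo, $mbox, $pending['uid']);
        }
        $pdo->prepare("DELETE FROM sync_journal WHERE uid = ?")->execute(array($pending['uid']));
    }

    // Normal pass: everything newer than the last successfully copied message.
    $lastUid = (int) $pdo->query("SELECT MAX(uid) FROM messages")->fetchColumn();
    $uids    = imap_search($mbox, 'ALL', SE_UID);
    foreach ($uids ? $uids : array() as $uid) {
        if ($uid <= $lastUid) continue;
        // note 1: record the intent before touching the network
        $pdo->prepare("INSERT INTO sync_journal (uid, stage) VALUES (?, 'fetching')")->execute(array($uid));
        fetch_one($pdo, $mbox, $uid);
        $pdo->prepare("DELETE FROM sync_journal WHERE uid = ?")->execute(array($uid));
    }
}

function fetch_one(PDO $pdo, $mbox, $uid) {
    $raw = imap_fetchheader($mbox, $uid, FT_UID) . imap_body($mbox, $uid, FT_UID);
    // note 2: the mail is safely in hand; update the journal before storing it
    $pdo->prepare("UPDATE sync_journal SET stage = 'retrieved', raw = ? WHERE uid = ?")->execute(array($raw, $uid));
    store_message($pdo, $uid, $raw);
}

function store_message(PDO $pdo, $uid, $raw) {
    // INSERT IGNORE (MySQL) makes a replayed store harmless after a crash between steps.
    $pdo->prepare("INSERT IGNORE INTO messages (uid, raw) VALUES (?, ?)")->execute(array($uid, $raw));
}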

Then you can easily put the actual calls in a continuously refreshing
ajax call.


so long,
->Tyberius Prime

ps: 100 messages/minute is no real load for a scalable e-mail system...

paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-13-2009 08:57

Ok, that pretty much put everything in perspective. So what I've done so far is separate the process of downloading emails from the IMAP server into the db from the actual mail program that reads the db as though it were a mailbox.

It pretty much made sense - why would you want to restrict downloading emails into the db to only when someone is browsing your website? The entire aim is self-defeating with the iframe idea, because the point was to access the db as though it were a mailbox and remove all IMAP calls from the client interface.

The logic I've used so far is that the downloading script would be called using the Windows scheduler, and it would check downloaded messages against the Message-ID, which is said to be unique to each and every email created (it's a huge ugly string). This check is pretty secondary though, as the aim is that once we download the email and its attachments we delete the email for good from the online IMAP server - so the logic goes like this:

code:
Get all emails from the IMAP server - assuming that if an email still exists it wasn't downloaded and deleted
For each allEmails as oneEmail
  does the Message-ID of this email already exist in the db?

  If yes
    do not download this email and move on
  else
    download this email

  If this email has attachments
    download the attachments and associate them with the email
  else
    go on

  // only if we get this far do we delete the email
  delete the email from the IMAP server
loop
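
For reference, the same loop written against PHP's imap extension might look roughly like this (message_exists(), save_message() and save_attachment() are made-up database helpers, and the connection details are placeholders):

code:
<?php
$mbox = imap_open('{imap.example.com:993/imap/ssl}INBOX', 'account', 'secret');
$uids = imap_search($mbox, 'ALL', SE_UID);

foreach ($uids ? $uids : array() as $uid) {
    $ov  = imap_fetch_overview($mbox, (string) $uid, FT_UID);
    $mid = $ov[0]->message_id;

    if (!message_exists($mid)) {                        // the secondary dedupe check
        $raw = imap_fetchheader($mbox, $uid, FT_UID) . imap_body($mbox, $uid, FT_UID);
        save_message($mid, $raw);

        // Walk the MIME structure and pull down any attachments.
        $struct = imap_fetchstructure($mbox, $uid, FT_UID);
        if (!empty($struct->parts)) {
            foreach ($struct->parts as $i => $part) {
                if ($part->ifdisposition && strtolower($part->disposition) === 'attachment') {
                    $data = imap_fetchbody($mbox, $uid, (string) ($i + 1), FT_UID);
                    if ($part->encoding == ENCBASE64) {
                        $data = base64_decode($data);
                    } elseif ($part->encoding == ENCQUOTEDPRINTABLE) {
                        $data = quoted_printable_decode($data);
                    }
                    save_attachment($mid, $part, $data);
                }
            }
        }
    }

    // Only reached once the download (or the dedupe check) succeeded,
    // so a mail that's already in the db still gets cleaned off the server.
    imap_delete($mbox, (string) $uid, FT_UID);
}

imap_expunge($mbox);
imap_close($mbox);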



The only pitfall would be that, should this cease midway, there can be two situations:
- we have downloaded an email but not one or all of its attachments
- we downloaded but couldn't delete

To counter that, the fact that deletion only happens once everything is downloaded - plus a simple check to make sure we don't already have the email in our db - should solve the problem.

The thing is, how do I set up the scheduler in such a way that it doesn't make a new call while the script from the previous call hasn't finished running? In that case we would most probably have two instances of the program running... or wait, isn't that supposed to be a good thing?

paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-13-2009 09:27

There is one thing though - the download should check for new messages every few minutes, and I don't think the Windows Task Scheduler has anything that can be scheduled at such a short interval...

Tyberius Prime
Maniac (V) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 01-13-2009 14:39

Though I don't believe your last point, you can either set up a task every x minutes, or have an 'eternally running' service that simply pauses for a few minutes each time (per mailbox!... next layer of looping...) (and gets restarted if it ever dies), but you are going to need to make sure that only one of them is running at a given time.
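
One simple way to enforce the 'only one running at a given time' part is a lock file: grab an exclusive, non-blocking lock at the top of the script and bail out if another instance already holds it (a sketch under my own assumptions - the lock file name and sync_all_mailboxes() are invented; if flock's non-blocking mode misbehaves on your Windows/PHP combination, an exclusive fopen with mode 'x' on the lock file does the same job). The scheduler side is less of a problem than it sounds, too: schtasks can fire a task every few minutes with /SC MINUTE /MO 5.

code:
<?php
// Guard against overlapping runs: only one copy of the sync script at a time.
$lock = fopen(__DIR__ . '/mailsync.lock', 'c');      // 'c': create if missing, never truncate
if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {    // non-blocking: give up immediately
    exit("another sync run is still active\n");
}

sync_all_mailboxes();   // placeholder for the actual per-mailbox copy loop

flock($lock, LOCK_UN);
fclose($lock);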

Seriously, what the heck are you building? How many mailboxes are you (planning on) retrieving? Have you looked into fetchmail - that would handle everything but putting your mails into the database. Have you considered using the standard maildir format for local storage?
How's this ever going to make money (I suppose from your posts that this is not a personal project?)

so long,

->Tyberius Prime

paritycheck
Bipolar (III) Inmate

From: you tell me
Insane since: Mar 2004

posted 01-14-2009 18:02

Well, I fixed it up as you stated - now I have two code snippets working: one downloads the emails into the message db, and the web client just checks the db as usual; when it finds an email marked recent - bingo - it grabs it. The idea is to make this webmail client work as fast as possible. You might think it's pretty unnecessary, but the end result is that after rigging it up to read from the database rather than an IMAP server, and having an independent script called via a scheduled task every couple of minutes to populate the db with new emails, the speed is so amazing it even beats Outlook!

I think I'll put together a simple tutorial on how to turn Roundcube from an IMAP web client into a simple interface for a db-based web client.

Well, it's kind of a loooooong story what the project is, as it's kinda amoeba-like in the sense that every now and then my boss has a new idea to build upon, and once that's done we have a new idea to build upon, and this process goes on and on. We had a deadline to finish this project in under 3 months - considering it's almost 2 years since we started, I guess you have an idea of how many mutations this thing has gone through to reach its current shape. It might seem pointless, and frankly, from what I understand a simple solution that does a simple task would be more commercially viable than a humongous suite of programs that does everything, most of which is stuff people would hardly use - but hey, I'm not complaining, in the sense that I'm getting paid well and the experience is amazing. I must have worked in programming regions other noobs would fear to venture into, and heck, it feels so great when you talk to your programmer friends about what you're working on and you get replies like 'Are you crazy?', 'U DOING THAT IN PHP??? IS THAT EVEN POSSIBLE!?!?!?!' - at the end of the day it's a nifty piece of work to show off.

It's pretty much another humongous collaboration system - just about it, really, only it hasn't assumed a final shape as of yet.

Laughing Otter
Obsessive-Compulsive (I) Inmate

From:
Insane since: Feb 2013

posted 02-06-2013 21:25

I know this is a bit old, but I am using imapproxy in a web-app environment. Configuring it just involved filling in the blanks in the config file, and it has significantly improved performance: first thing in the morning it takes about 15 seconds to connect to the server, but subsequent connections are more or less immediate.

Plus it was the only way I could get PHP to connect to Exchange via IMAP. The Exchange handshaking is apparently non-standard, but the proxy can handle it where PHP itself can't.

Now if I could only get Exchange to stop munging appended multipart messages to text/plain I'd be set.
